Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/08/01 18:49:25 UTC

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2249

See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2249/display/redirect>

Changes:


------------------------------------------
[...truncated 347.50 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
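
The failure above is Beam's standard coder-resolution error: a ParDo emitting Row elements cannot have a Coder inferred, so the schema must be attached to the output PCollection explicitly, exactly as the message suggests. A minimal Java sketch of the fix; the field names and the RowMonitorFn DoFn are illustrative assumptions, not the test's actual code:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Declare the schema the emitted Rows conform to (fields assumed here).
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

    // setRowSchema() attaches the schema and thereby a RowCoder, satisfying
    // the "No Coder has been manually specified" check; calling
    // .setCoder(RowCoder.of(schema)) would have the same effect.
    PCollection<Row> monitored =
        input
            .apply("RowMonitor", ParDo.of(new RowMonitorFn())) // hypothetical DoFn
            .setRowSchema(schema);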

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 01, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 01, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 01, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 01, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 01, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 01, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 01, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
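
The SQLPlan/BEAMPlan pair above shows the planner folding the LogicalProject and LogicalFilter into a BeamPushDownIOSourceRel, so only the four used fields and the supported predicate are sent to the BigQuery Storage API. For reference, a sketch of the same query written with Beam SQL's SqlTransform over an in-pipeline PCollection (inputRows is assumed; note that push-down as logged here requires a table provider such as the BigQuery one, whereas a plain PCOLLECTION input is filtered inside Beam):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Same projection and filter as in the planner log above.
    PCollection<Row> result =
        inputRows.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
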
    Aug 01, 2021 6:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 01, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 01, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash 945a905d9b31422943ced3df538b02093ee3fff7205727a79c56ee5cb9972fe9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-lFqQXZsxQilDztPfU4sCCT7j__cgVyennFbuXLmXL-k.pb
    Aug 01, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 01, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-8dj8jYMMHXszIcz-CHqEr7RZVD7HTRzv2xQ3RMYUFJg.jar
    Aug 01, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4223654319046912460.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-8dI2QUvYb13Oi_VftL8N8nsdNMcMn2WPHJEFvfvJvTo.jar
    Aug 01, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 01, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 01, 2021 6:45:06 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
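
The SEVERE block above is gRPC's orphaned-channel detector: a BigQueryWriteClient created during pipeline validation was garbage-collected without its channel ever being shut down. The remedy the log itself prescribes, as a minimal standalone sketch (target and timeout are illustrative; awaitTermination throws InterruptedException, which the caller must handle or propagate):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    ManagedChannel channel =
        ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
            .useTransportSecurity()
            .build();
    try {
      // ... issue RPCs over the channel ...
    } finally {
      channel.shutdown();                      // stop accepting new calls
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
        channel.shutdownNow();                 // force-cancel stragglers
      }
    }

For the wrapped client in this stack trace, the equivalent fix is closing BigQueryWriteClient itself (it is AutoCloseable), for example in a try-with-resources block, which shuts down the underlying channel pool.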

    Aug 01, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 01, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 01, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 01, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 01, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-01_11_45_07-6927065466802167640?project=apache-beam-testing
    Aug 01, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-01_11_45_07-6927065466802167640
    Aug 01, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-01_11_45_07-6927065466802167640
    Aug 01, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-01T18:45:10.818Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 01, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:16.967Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 01, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:17.784Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 01, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:17.835Z: Expanding GroupByKey operations into optimizable parts.
    Aug 01, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:17.880Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 01, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:17.969Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 01, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:18.003Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 01, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:18.046Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 01, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:18.081Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 01, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:18.497Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 01, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:18.565Z: Starting 5 workers in us-central1-c...
    Aug 01, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:45:23.514Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 01, 2021 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:46:03.269Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 01, 2021 6:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:46:29.941Z: Workers have started successfully.
    Aug 01, 2021 6:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:46:29.966Z: Workers have started successfully.
    Aug 01, 2021 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:47:01.376Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 01, 2021 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:47:01.515Z: Cleaning up.
    Aug 01, 2021 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:47:01.591Z: Stopping worker pool...
    Aug 01, 2021 6:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:49:16.289Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 01, 2021 6:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-01T18:49:16.337Z: Worker pool stopped.
    Aug 01, 2021 6:49:22 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-01_11_45_07-6927065466802167640 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0cfe4f0b-3619-4975-9006-1d2c9a168688 and timestamp: 2021-08-01T18:49:22.738000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.992

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 01, 2021 6:49:23 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 32.348 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 3s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/al46upymae7ga

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2499

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2499/display/redirect>

Changes:


------------------------------------------
[...truncated 340.28 KB...]
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 3d8d79e60883a9ead96d04050fca6504
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Oct 03, 2021 6:44:55 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Oct 03, 2021 6:44:56 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 03, 2021 6:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 03, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@613927329]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@688000764]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 03, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Oct 03, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 03, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 03, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Oct 03, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5527341037863145142.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ldCjmuz0bi2z703eO-kB9-TVejm2o8cJKIVQDSJHXi0.jar
    Oct 03, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Oct 03, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Oct 03, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Oct 03, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Oct 03, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 5 files newly uploaded in 0 seconds
    Oct 03, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 03, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash c6679677be01e2c3c4bb0d49fc728ab65ece3941587bdd46075f16e07952e5a1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xmeWd74B4sPEuw1J_HKKtl7OOUFYe91GB18W4HlS5aE.pb
    Oct 03, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 03, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Oct 03, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Oct 03, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Oct 03, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-02_23_45_09-11935331738148933063?project=apache-beam-testing
    Oct 03, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-10-02_23_45_09-11935331738148933063
    Oct 03, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-02_23_45_09-11935331738148933063
    Oct 03, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-10-03T06:45:12.715Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 03, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:45:18.220Z: Worker configuration: e2-standard-2 in us-central1-a.
    Oct 03, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:45:18.870Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 03, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:45:18.906Z: Expanding GroupByKey operations into optimizable parts.
    Oct 03, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:45:18.932Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 03, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:45:18.995Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 03, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:45:19.022Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 03, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:45:19.042Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Oct 03, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:45:19.385Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 03, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:45:19.461Z: Starting 5 workers in us-central1-a...
    Oct 03, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:45:35.885Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 03, 2021 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:46:01.146Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 03, 2021 6:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:46:27.186Z: Workers have started successfully.
    Oct 03, 2021 6:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:46:27.216Z: Workers have started successfully.
    Oct 03, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:46:55.302Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 03, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:46:55.426Z: Cleaning up.
    Oct 03, 2021 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:46:55.501Z: Stopping worker pool...
    Oct 03, 2021 6:49:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:49:23.087Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 03, 2021 6:49:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T06:49:23.136Z: Worker pool stopped.
    Oct 03, 2021 6:49:30 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-10-02_23_45_09-11935331738148933063 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): aa063f72-c3a3-4f32-81b6-cc209e47ed4e and timestamp: 2021-10-03T06:49:30.085000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.325

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2021 6:49:30 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 38.779 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/wsd6paxz33bg2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2498

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2498/display/redirect>

Changes:


------------------------------------------
[...truncated 339.30 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 3d8d79e60883a9ead96d04050fca6504
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Oct 03, 2021 12:44:55 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Oct 03, 2021 12:44:56 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 03, 2021 12:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 03, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@850165773]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@793252937]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 03, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 03, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 03, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
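
This is the push-down at work: with DIRECT_READ, the planner produced a BeamPushDownIOSourceRel instead of a BeamCalcRel, handing the projected columns (usedFields) and the supported filter to the BigQuery Storage Read API rather than evaluating them in the pipeline. A sketch of the equivalent stand-alone BigQueryIO read (the table spec is a placeholder, not the table this test reads, and running it needs GCP credentials):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news") // placeholder table spec
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Projection push-down: only these columns leave BigQuery.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Filter push-down: evaluated server-side by the Storage Read API.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
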
    Oct 03, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 03, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 03, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Oct 03, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test924415722746679332.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test--TrSaHjGfQDLqi9c4923o6CG9tiTeH0tOQo81B3STRw.jar
    Oct 03, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Oct 03, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 03, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104891 bytes, hash 2378d85467543a1a3b5e0d3948b43275ba3f975a1b25eef73fbda532a3bf4a17> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-I3jYVGdUOho7Xg05SLQydbo_l1obJe73P72lMqO_Shc.pb
    Oct 03, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 03, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Oct 03, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Oct 03, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Oct 03, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-02_17_45_08-847305522168178165?project=apache-beam-testing
    Oct 03, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-10-02_17_45_08-847305522168178165
    Oct 03, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-02_17_45_08-847305522168178165
    Oct 03, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-10-03T00:45:14.054Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 03, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:45:19.984Z: Worker configuration: e2-standard-2 in us-central1-c.
    Oct 03, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:45:20.834Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 03, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:45:20.876Z: Expanding GroupByKey operations into optimizable parts.
    Oct 03, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:45:20.911Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 03, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:45:20.984Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 03, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:45:21.010Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 03, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:45:21.041Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Oct 03, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:45:21.390Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 03, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:45:21.446Z: Starting 5 workers in us-central1-c...
    Oct 03, 2021 12:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:45:34.512Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 03, 2021 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:46:05.764Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 03, 2021 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:46:31.377Z: Workers have started successfully.
    Oct 03, 2021 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:46:31.412Z: Workers have started successfully.
    Oct 03, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:46:59.267Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 03, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:46:59.464Z: Cleaning up.
    Oct 03, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:46:59.538Z: Stopping worker pool...
    Oct 03, 2021 12:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:49:24.875Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 03, 2021 12:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-03T00:49:24.919Z: Worker pool stopped.
    Oct 03, 2021 12:49:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-10-02_17_45_08-847305522168178165 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 76ceceec-88f9-4777-8b6c-bd8bf58d3eeb and timestamp: 2021-10-03T00:49:32.184000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.986

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 03, 2021 12:49:32 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
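
The warning concerns metrics publishing only; the load-test measurements were still printed above. Publishing would mean pointing the test-utils publisher at an InfluxDB instance, roughly as sketched below, assuming the InfluxDBSettings builder that accompanies InfluxDBPublisher (the builder method names and every value here are assumptions, not taken from this run):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class MetricsPublishingSketch {
      public static void main(String[] args) {
        // All values are placeholders for a real InfluxDB endpoint.
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")
                .withDatabase("beam_test_metrics")
                .withMeasurement("sql_bqio_read_java_batch")
                .get();
      }
    }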

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.021 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 40.668 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 14s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/koe2y4us665ni

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2497

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2497/display/redirect>

Changes:


------------------------------------------
[...truncated 338.04 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 3d8d79e60883a9ead96d04050fca6504
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Oct 02, 2021 6:44:51 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Oct 02, 2021 6:44:51 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 02, 2021 6:44:52 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 02, 2021 6:44:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 6:44:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 6:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 6:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@70049737]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 02, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@138522610]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 02, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Oct 02, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 02, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 02, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Oct 02, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2270684606588946108.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-aI3GlyaS01dmmjHAWA8z2L4kT-2mYbi05Qb1Pzn6Bas.jar
    Oct 02, 2021 6:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Oct 02, 2021 6:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 02, 2021 6:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash 7e76b48122056d8d57589b41fd006d6f26ee28106fa96c9e64e7d6cf4024fe99> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-fna0gSIFbY1XWJtB_QBtbybuKBBvqWyeZOfWz0Ak_pk.pb
    Oct 02, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 02, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Oct 02, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Oct 02, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Oct 02, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-02_11_45_03-10805159815140293046?project=apache-beam-testing
    Oct 02, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-10-02_11_45_03-10805159815140293046
    Oct 02, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-02_11_45_03-10805159815140293046
    Oct 02, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-10-02T18:45:07.210Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 02, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:12.122Z: Worker configuration: e2-standard-2 in us-central1-a.
    Oct 02, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:12.765Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 02, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:12.803Z: Expanding GroupByKey operations into optimizable parts.
    Oct 02, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:12.835Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 02, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:12.904Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 02, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:12.920Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 02, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:12.959Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Oct 02, 2021 6:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:13.277Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 02, 2021 6:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:13.348Z: Starting 5 workers in us-central1-a...
    Oct 02, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:43.222Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 02, 2021 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:45:59.456Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 02, 2021 6:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:46:24.817Z: Workers have started successfully.
    Oct 02, 2021 6:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:46:24.842Z: Workers have started successfully.
    Oct 02, 2021 6:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:46:51.107Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 02, 2021 6:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:46:51.231Z: Cleaning up.
    Oct 02, 2021 6:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:46:51.302Z: Stopping worker pool...
    Oct 02, 2021 6:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:49:16.327Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 02, 2021 6:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T18:49:16.366Z: Worker pool stopped.
    Oct 02, 2021 6:49:24 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-10-02_11_45_03-10805159815140293046 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 50b2eac8-5528-42f0-8769-ddffd4ba8580 and timestamp: 2021-10-02T18:49:24.414000000Z:
                     Metric:                    Value:
                   read_time                     6.322
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2021 6:49:24 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 37.51 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/opgofo7adrz4o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2496

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2496/display/redirect>

Changes:


------------------------------------------
[...truncated 339.09 KB...]

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 3d8d79e60883a9ead96d04050fca6504
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Oct 02, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Oct 02, 2021 12:44:54 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 02, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 02, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@850165773]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 02, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Oct 02, 2021 12:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 02, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 02, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Oct 02, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test33187000213534086.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-s0Ua8XEPsFQPPuOm96uiigzcXG-39oKQ9e8AXeUrHS8.jar
    Oct 02, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.69/6cb063f1d527ff65bdbb9ea74888a5ffc3f92197/fastjson-1.2.69.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.69-KniRdEoDeOAJmpuB3b7U0uGn6GMvb4b4_8-jg-UeScg.jar
    Oct 02, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 248 files cached, 2 files newly uploaded in 0 seconds
    Oct 02, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 02, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash 0dc756c4b389ce0fa11003af05bca7f297ddf24ca271270ad5807b7d2e746487> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-DcdWxLOJzg-hEAOvBbyn8pfd8kyicScK1YB7fS50ZIc.pb
    Oct 02, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 02, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Oct 02, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Oct 02, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Oct 02, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-02_05_45_07-11784265171889707970?project=apache-beam-testing
    Oct 02, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-10-02_05_45_07-11784265171889707970
    Oct 02, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-02_05_45_07-11784265171889707970
    Oct 02, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-10-02T12:45:10.607Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:17.109Z: Worker configuration: e2-standard-2 in us-central1-c.
    Oct 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:17.936Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:17.968Z: Expanding GroupByKey operations into optimizable parts.
    Oct 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:18.004Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:18.081Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:18.107Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:18.144Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Oct 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:18.480Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:18.550Z: Starting 5 workers in us-central1-c...
    Oct 02, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:44.525Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 02, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:49.478Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Oct 02, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:49.509Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Oct 02, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:45:59.755Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 02, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:46:23.466Z: Workers have started successfully.
    Oct 02, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:46:23.500Z: Workers have started successfully.
    Oct 02, 2021 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:46:52.468Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 02, 2021 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:46:52.591Z: Cleaning up.
    Oct 02, 2021 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:46:52.665Z: Stopping worker pool...
    Oct 02, 2021 12:49:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:49:17.724Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 02, 2021 12:49:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T12:49:17.772Z: Worker pool stopped.
    Oct 02, 2021 12:49:24 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-10-02_05_45_07-11784265171889707970 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dedadd50-3e74-47e6-9654-b427973b8b6b and timestamp: 2021-10-02T12:49:24.131000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.222

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2021 12:49:24 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
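
This warning comes from the test-metrics publisher, not from the pipeline run itself: load-test results are only exported when the InfluxDB settings are supplied. A sketch of the extra options that would satisfy the check, appended to the -DbeamTestPipelineOptions list passed to the test JVM, assuming the standard Beam testutils option names (the names and values here are assumptions, not taken from this job's configuration):

    "--influxDatabase=beam_test_metrics",
    "--influxMeasurement=sql_bqio_read_java_batch",
    "--influxHost=http://localhost:8086"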

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 34.872 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings
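
To see the individual deprecation warnings locally, the failing task can be rerun with the flag Gradle suggests, for example:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --warning-mode all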

BUILD FAILED in 5m 5s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/yxcpdea4k4l5w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2495

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2495/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12513] Schemas and Coders (#15632)


------------------------------------------
[...truncated 337.54 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':',5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 3d8d79e60883a9ead96d04050fca6504
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Oct 02, 2021 6:44:50 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
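
    The deprecated flag originates in this job's -DbeamTestPipelineOptions (visible in the 'Starting process' line above); the rename is mechanical, keeping the currently empty value:

        "--workerHarnessContainerImage=",   (deprecated legacy option)
        "--sdkContainerImage=",             (preferred replacement)
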
    Oct 02, 2021 6:44:51 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 02, 2021 6:44:51 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 02, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@613927329]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
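
The usual remedy for this IllegalStateException is the one the message itself suggests: attach a schema (or an explicit RowCoder) to the Row-typed PCollection before the pipeline finishes specifying it. A minimal Java sketch, assuming a hypothetical schema whose field names match the projected columns above (the field types and variable names are assumptions, not taken from this build):

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical schema for the four projected columns; the field types are guesses.
    Schema schema =
        Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

    PCollection<Row> rows = /* the Row-typed output, e.g. from BeamSqlRelUtils.toPCollection */;
    rows.setRowSchema(schema);  // what the error message asks for
    // ...or equivalently: rows.setCoder(RowCoder.of(schema));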

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 02, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
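
    For context, DIRECT_READ (and with it the projection/filter push-down shown above) is selected per table in Beam SQL's BigQuery table provider. A minimal sketch of a table definition that requests it, with the column list abbreviated and the LOCATION value a placeholder rather than this test's actual table:

        CREATE EXTERNAL TABLE HACKER_NEWS (
            title VARCHAR, `by` VARCHAR, score INTEGER, type VARCHAR
            -- remaining HACKER_NEWS columns omitted
        )
        TYPE bigquery
        LOCATION 'my-project:my_dataset.hacker_news'
        TBLPROPERTIES '{"method": "DIRECT_READ"}'
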
    Oct 02, 2021 6:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 02, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 02, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Oct 02, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3291093574128526183.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JQLYC_B7axlOWx2YCYh9nveYzWLbXvACd1k_Cr3TckU.jar
    Oct 02, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Oct 02, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 02, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash cdf0481c711dd159373ac3b3bc44104c72ecebf31f065a5590122e6ffd5e8ed7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zfBIHHEd0Vk3OsOzvEQQTHLs6_MfBlpVkBIub_1ejtc.pb
    Oct 02, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 02, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Oct 02, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Oct 02, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Oct 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-01_23_45_03-2032330720690453059?project=apache-beam-testing
    Oct 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-10-01_23_45_03-2032330720690453059
    Oct 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-01_23_45_03-2032330720690453059
    Oct 02, 2021 6:45:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-10-02T06:45:06.938Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:13.276Z: Worker configuration: e2-standard-2 in us-central1-c.
    Oct 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:14.015Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:14.055Z: Expanding GroupByKey operations into optimizable parts.
    Oct 02, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:14.081Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 02, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:14.144Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 02, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:14.170Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 02, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:14.203Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Oct 02, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:14.506Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 02, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:14.572Z: Starting 5 workers in us-central1-c...
    Oct 02, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:37.970Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 02, 2021 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:45:49.961Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 02, 2021 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:46:24.150Z: Workers have started successfully.
    Oct 02, 2021 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:46:24.180Z: Workers have started successfully.
    Oct 02, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:46:51.961Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 02, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:46:52.111Z: Cleaning up.
    Oct 02, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:46:52.184Z: Stopping worker pool...
    Oct 02, 2021 6:49:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:49:12.021Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 02, 2021 6:49:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T06:49:12.123Z: Worker pool stopped.
    Oct 02, 2021 6:49:17 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-10-01_23_45_03-2032330720690453059 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 548c4b23-6edb-4267-9ef5-fe7adade5bf8 and timestamp: 2021-10-02T06:49:17.096000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.932

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2021 6:49:17 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 30.558 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 0s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/wqhy3kqscxoyc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2494

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2494/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11217] Implemented metrics filtering (#15482)


------------------------------------------
[...truncated 337.35 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 3d8d79e60883a9ead96d04050fca6504
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Oct 02, 2021 12:44:52 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Oct 02, 2021 12:44:53 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 02, 2021 12:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 02, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@850165773]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 02, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Oct 02, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Oct 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3974640514448782353.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-7HGRObmaSGVFS8QV2P8thwn5FmC6eCrknOuT0Bf2cyk.jar
    Oct 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Oct 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash 2bd99464ec2b6ff3616614c2a789ecb41b73f4d396080cab872b45206c25591d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-K9mUZOwrb_NhZhTCp4nstBtz9NOWCAyrhytFIGwlWR0.pb
    Oct 02, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 02, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Oct 02, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Oct 02, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Oct 02, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-01_17_45_05-1229165175629021093?project=apache-beam-testing
    Oct 02, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-10-01_17_45_05-1229165175629021093
    Oct 02, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-01_17_45_05-1229165175629021093
    Oct 02, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-10-02T00:45:08.604Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 02, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:45:15.499Z: Worker configuration: e2-standard-2 in us-central1-c.
    Oct 02, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:45:16.183Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 02, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:45:16.225Z: Expanding GroupByKey operations into optimizable parts.
    Oct 02, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:45:16.259Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 02, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:45:16.334Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 02, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:45:16.357Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 02, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:45:16.381Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Oct 02, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:45:16.745Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 02, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:45:16.816Z: Starting 5 workers in us-central1-c...
    Oct 02, 2021 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:45:28.234Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 02, 2021 12:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:46:00.975Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 02, 2021 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:46:26.187Z: Workers have started successfully.
    Oct 02, 2021 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:46:26.212Z: Workers have started successfully.
    Oct 02, 2021 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:46:56.483Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 02, 2021 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:46:56.671Z: Cleaning up.
    Oct 02, 2021 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:46:56.794Z: Stopping worker pool...
    Oct 02, 2021 12:49:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:49:15.664Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 02, 2021 12:49:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-02T00:49:15.711Z: Worker pool stopped.
    Oct 02, 2021 12:49:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-10-01_17_45_05-1229165175629021093 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4f713dcc-a53e-44f9-bef7-5aaf4a26ab5a and timestamp: 2021-10-02T00:49:20.830000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.426

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 02, 2021 12:49:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 32.389 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/rmy5gwhhtwwa2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2493

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2493/display/redirect?page=changes>

Changes:

[noreply] Switching to innerText and removing alerts

[noreply] [BEAM-12993] Update to Debezium 1.7.0.Final (#15636)

[noreply] Minor: Fix `Iterable[SplitResultResidual]` type errors (#15634)


------------------------------------------
[...truncated 338.04 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 3d8d79e60883a9ead96d04050fca6504
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Oct 01, 2021 6:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Oct 01, 2021 6:45:00 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 01, 2021 6:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 01, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@850165773]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
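
Both FAILED tests above hit the same pipeline-construction error: a ParDo stage that outputs Beam Row elements has no Coder, because a Coder cannot be inferred for Row and must instead come from a schema. The exception text itself names the fix, PCollection.setRowSchema. Below is a minimal, self-contained sketch of that pattern; the schema and the pass-through DoFn are stand-ins for illustration, not the RowMonitor code the IT actually uses:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Stand-in schema for illustration only; the real HACKER_NEWS row has more fields.
        Schema schema =
            Schema.builder().addStringField("author").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(
                    Create.of(Row.withSchema(schema).addValues("alice", 3L).build())
                        .withRowSchema(schema))
                .apply(
                    ParDo.of(
                        new DoFn<Row, Row>() { // pass-through, analogous to a monitor ParDo
                          @ProcessElement
                          public void process(@Element Row row, OutputReceiver<Row> out) {
                            out.output(row);
                          }
                        }))
                // Without this call, construction fails exactly as in the log:
                // "Cannot provide a coder for a Beam Row. Please provide a schema
                // instead using PCollection.setRowSchema."
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }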

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 01, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 01, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Oct 01, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 01, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 01, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Oct 01, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4824795570540795047.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ZSZHzKZ7N_SQD-MZgfeZ7Ndjbohxq-Ok3LUbeo0IcVo.jar
    Oct 01, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 1 seconds
    Oct 01, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 01, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash b3c4d4458a3186eb4ce590de1e748a9f4e5d5f53cc93309e911437bc92446ff9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-s8TURYoxhutM5ZDeHnSKn05dX1PMkzCekRQ3vJJEb_k.pb
    Oct 01, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 01, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Oct 01, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Oct 01, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Oct 01, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-01_11_45_13-142456295206191151?project=apache-beam-testing
    Oct 01, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-10-01_11_45_13-142456295206191151
    Oct 01, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-01_11_45_13-142456295206191151
    Oct 01, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-10-01T18:45:17.088Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 01, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:45:23.150Z: Worker configuration: e2-standard-2 in us-central1-a.
    Oct 01, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:45:23.930Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 01, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:45:23.963Z: Expanding GroupByKey operations into optimizable parts.
    Oct 01, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:45:23.992Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 01, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:45:24.079Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 01, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:45:24.106Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 01, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:45:24.121Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Oct 01, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:45:24.452Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 01, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:45:24.525Z: Starting 5 workers in us-central1-a...
    Oct 01, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:45:29.389Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 01, 2021 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:46:06.164Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 01, 2021 6:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:46:34.645Z: Workers have started successfully.
    Oct 01, 2021 6:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:46:34.687Z: Workers have started successfully.
    Oct 01, 2021 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:47:01.442Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 01, 2021 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:47:01.597Z: Cleaning up.
    Oct 01, 2021 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:47:01.670Z: Stopping worker pool...
    Oct 01, 2021 6:49:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:49:23.526Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 01, 2021 6:49:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T18:49:23.568Z: Worker pool stopped.
    Oct 01, 2021 6:49:29 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-10-01_11_45_13-142456295206191151 finished with status DONE.
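
Only readUsingDirectReadMethodPushDown ran to completion: the BEAMPlan above collapses the Calc/Filter pair into a single BeamPushDownIOSourceRel, so only the four used fields are read and the filter is evaluated server-side by the BigQuery Storage Read API instead of on the workers. For reference, the same projection and row restriction can be requested directly on BigQueryIO when DIRECT_READ is used. This is a sketch under assumed names, not the IT's code, and the table id below is a placeholder:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            "Read with push-down",
            BigQueryIO.readTableRows()
                // Placeholder table id; the job's actual source table is not shown in this log.
                .from("my-project:my_dataset.hacker_news")
                // DIRECT_READ selects the BigQuery Storage Read API, which is what
                // makes column selection and row restriction happen server-side.
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Mirrors usedFields=[by, type, title, score] from the plan above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Mirrors the filter the planner pushed down in the log.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }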

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 64c56dab-ce87-4057-b12d-fc9473e5b8f2 and timestamp: 2021-10-01T18:49:29.905000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.038

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2021 6:49:30 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 34.491 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/rvugmzkytt6qg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2492

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2492/display/redirect>

Changes:


------------------------------------------
[...truncated 339.39 KB...]
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 3d8d79e60883a9ead96d04050fca6504
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Oct 01, 2021 12:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Oct 01, 2021 12:44:57 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 01, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 01, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@850165773]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@793252937]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 01, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Oct 01, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 01, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 01, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Oct 01, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4723206100083085531.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qEpWi-6a7u1_PK_4qWoBtbdwxjaeiG6zedVyYTv0MG0.jar
    Oct 01, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_26_0/0.1/5fae4e97a2d8739462bd1572e48d01228766b6ef/beam-vendor-calcite-1_26_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_26_0-0.1-pYZ7esxRWyhKmBqBdfrpnxvg8woyykTvGbaCvLtyRyA.jar
    Oct 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 248 files cached, 2 files newly uploaded in 1 seconds
    Oct 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash 33ad4cc501abdb54fba5f37c03d455d33c6347eb4a1246cc2efe46128fcb61af> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-M61MxQGr21T7pfN8A9RV0zxjR-tKEkbMLv5GEo_LYa8.pb
    Oct 01, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 01, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Oct 01, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Oct 01, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Oct 01, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-01_05_45_11-5449097116956767890?project=apache-beam-testing
    Oct 01, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-10-01_05_45_11-5449097116956767890
    Oct 01, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-10-01_05_45_11-5449097116956767890
    Oct 01, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-10-01T12:45:16.076Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 01, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:45:24.067Z: Worker configuration: e2-standard-2 in us-central1-c.
    Oct 01, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:45:24.891Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 01, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:45:24.937Z: Expanding GroupByKey operations into optimizable parts.
    Oct 01, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:45:24.972Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 01, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:45:25.073Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 01, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:45:25.099Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 01, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:45:25.140Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Oct 01, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:45:25.644Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 01, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:45:25.756Z: Starting 5 workers in us-central1-c...
    Oct 01, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:45:48.950Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 01, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:46:10.070Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 01, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:46:34.013Z: Workers have started successfully.
    Oct 01, 2021 12:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:46:34.044Z: Workers have started successfully.
    Oct 01, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:47:03.394Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 01, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:47:03.554Z: Cleaning up.
    Oct 01, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:47:03.634Z: Stopping worker pool...
    Oct 01, 2021 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:49:20.174Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 01, 2021 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T12:49:20.237Z: Worker pool stopped.
    Oct 01, 2021 12:49:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-10-01_05_45_11-5449097116956767890 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d0ab1099-69b9-40ed-9191-e65610971fe5 and timestamp: 2021-10-01T12:49:27.722000000Z:
                     Metric:                    Value:
                   read_time                     9.199
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2021 12:49:28 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 34.786 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 8s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/5jh474kjyd6vq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2491

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2491/display/redirect>

Changes:


------------------------------------------
[...truncated 337.78 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':',5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 3d8d79e60883a9ead96d04050fca6504
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Oct 01, 2021 6:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Oct 01, 2021 6:44:58 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 01, 2021 6:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 01, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@613927329]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 01, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@688000764]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 01, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Oct 01, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 01, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 01, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Oct 01, 2021 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3299828175081092784.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-7FktRZ62Gz2eHphKeQzeql6S6amjKEg1_JupBewy7bU.jar
    Oct 01, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 1 seconds
    Oct 01, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 01, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash 44c58d35db3a08d70e83108a6ce29ae09f8ab7d0e8180073f4eb362973880439> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-RMWNNds6CNcOgxCKbOKa4J-Kt9DoGABz9Os2KXOIBDk.pb
    Oct 01, 2021 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 01, 2021 6:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Oct 01, 2021 6:45:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Oct 01, 2021 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Oct 01, 2021 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-30_23_45_17-15965796272908244087?project=apache-beam-testing
    Oct 01, 2021 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-30_23_45_17-15965796272908244087
    Oct 01, 2021 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-30_23_45_17-15965796272908244087
    Oct 01, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-10-01T06:45:20.619Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 01, 2021 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:45:27.388Z: Worker configuration: e2-standard-2 in us-central1-c.
    Oct 01, 2021 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:45:28.080Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 01, 2021 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:45:28.121Z: Expanding GroupByKey operations into optimizable parts.
    Oct 01, 2021 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:45:28.157Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 01, 2021 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:45:28.232Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 01, 2021 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:45:28.266Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 01, 2021 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:45:28.299Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Oct 01, 2021 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:45:28.979Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 01, 2021 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:45:29.066Z: Starting 5 workers in us-central1-c...
    Oct 01, 2021 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:45:56.913Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 01, 2021 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:46:12.087Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 01, 2021 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:46:39.907Z: Workers have started successfully.
    Oct 01, 2021 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:46:39.944Z: Workers have started successfully.
    Oct 01, 2021 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:47:11.239Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 01, 2021 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:47:11.390Z: Cleaning up.
    Oct 01, 2021 6:47:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:47:11.481Z: Stopping worker pool...
    Oct 01, 2021 6:49:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:49:38.376Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 01, 2021 6:49:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T06:49:38.417Z: Worker pool stopped.
    Oct 01, 2021 6:49:46 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-30_23_45_17-15965796272908244087 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f683088c-36be-4160-b47e-ac6bcabbe3bc and timestamp: 2021-10-01T06:49:46.231000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.766
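
By contrast, readUsingDirectReadMethodPushDown passes: the BEAMPlan above shows the projection (usedFields=[by, type, title, score]) and the filter being folded into the source as a BeamPushDownIOSourceRel, so BigQuery returns only the four needed columns. Outside Beam SQL, roughly the same read can be expressed directly against BigQueryIO; a sketch that assumes the public Hacker News table (the exact table id is not in this log):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDown {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full") // assumed table id
            .withMethod(Method.DIRECT_READ)                // BigQuery Storage Read API
            // Roughly what the SQL push-down turns into:
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }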

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2021 6:49:46 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
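
The InfluxDB warning above (it recurs in every run in this thread) only means the optional metrics sink was never configured, so results go to the BigQuery metrics table alone. If publishing were wanted, the Beam test utilities take the target from pipeline options; the option names below are an assumption to verify against InfluxDBPublisher/InfluxDBSettings, not values taken from this log:

    --influxHost=http://localhost:8086 \
    --influxDatabase=beam_test_metrics \
    --influxMeasurement=sql_bqio_read_java_batch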

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 53.614 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 26s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/kbjuzj36jtjba

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2490

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2490/display/redirect?page=changes>

Changes:

[dpcollins] [BEAM-12908] Change to use PubsubSignal for information propagation so

[Robert Bradshaw] Preserve more types in transform replacement.

[Robert Bradshaw] Update test to reflect preserved type hint.

[noreply] [BEAM-11985] Python Bigtable - Implement IO Request Count metrics

[noreply] [BEAM-9918] Make TryCrossLanguage match non Try API (#15633)

[noreply] [BEAM-12957] Add support for pyarrow 5.x (#15588)


------------------------------------------
[...truncated 341.46 KB...]
> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 3d8d79e60883a9ead96d04050fca6504
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'
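
As a side note, the executor command line above documents how these suites are parameterized: pipeline options travel as a JSON array in the beamTestPipelineOptions system property. A hedged local-reproduction sketch assembled from the task path and property name in this log (option list abbreviated; assumes a Beam checkout and GCP credentials, and whether Gradle forwards the property this way should be checked against the Beam build scripts):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
      --tests org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT \
      -DbeamTestPipelineOptions='["--project=<your-project>","--tempLocation=gs://<your-bucket>/tmp","--runner=DataflowRunner","--region=us-central1"]'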

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Oct 01, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Oct 01, 2021 12:45:10 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 01, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 01, 2021 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 12:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 12:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@613927329]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Oct 01, 2021 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2021 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Oct 01, 2021 12:45:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 12:45:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2021 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Oct 01, 2021 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Oct 01, 2021 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Oct 01, 2021 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Oct 01, 2021 12:45:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Oct 01, 2021 12:45:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Oct 01, 2021 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 01, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 01, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Oct 01, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-qOxdI4KUjo3GTP9rF7KeqZGuhs9XeyH-VdOKR6HUq6Q.jar
    Oct 01, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1429108089442257848.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-w7MG2wlG_XkuF6bjtEgQNkbrEnEKiP3pFRtFqP2J6pE.jar
    Oct 01, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 248 files cached, 2 files newly uploaded in 0 seconds
    Oct 01, 2021 12:45:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Oct 01, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104891 bytes, hash 0184a43e3e51cd74e76d0438af8bb9c64cb96b68908e74269f492224778065c7> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-AYSkPj5RzXTnbQQ4r4u5xky5a2iQjnQmn0kiJHeAZcc.pb
    Oct 01, 2021 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Oct 01, 2021 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Oct 01, 2021 12:45:22 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Oct 01, 2021 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Oct 01, 2021 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-30_17_45_23-9121831774679467180?project=apache-beam-testing
    Oct 01, 2021 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-30_17_45_23-9121831774679467180
    Oct 01, 2021 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-30_17_45_23-9121831774679467180
    Oct 01, 2021 12:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-10-01T00:45:27.377Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Oct 01, 2021 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:45:33.914Z: Worker configuration: e2-standard-2 in us-central1-a.
    Oct 01, 2021 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:45:34.799Z: Expanding CoGroupByKey operations into optimizable parts.
    Oct 01, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:45:34.840Z: Expanding GroupByKey operations into optimizable parts.
    Oct 01, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:45:34.877Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Oct 01, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:45:34.975Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Oct 01, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:45:35.013Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Oct 01, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:45:35.032Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Oct 01, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:45:35.349Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 01, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:45:35.429Z: Starting 5 workers in us-central1-a...
    Oct 01, 2021 12:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:45:57.313Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Oct 01, 2021 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:46:16.336Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Oct 01, 2021 12:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:46:16.362Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Oct 01, 2021 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:46:26.720Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Oct 01, 2021 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:46:49.531Z: Workers have started successfully.
    Oct 01, 2021 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:46:49.548Z: Workers have started successfully.
    Oct 01, 2021 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:47:17.957Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Oct 01, 2021 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:47:18.091Z: Cleaning up.
    Oct 01, 2021 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:47:18.173Z: Stopping worker pool...
    Oct 01, 2021 12:49:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:49:52.733Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 01, 2021 12:49:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-10-01T00:49:52.763Z: Worker pool stopped.
    Oct 01, 2021 12:49:58 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-30_17_45_23-9121831774679467180 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 47d31f15-03a8-4eb1-bea7-de151ad3e716 and timestamp: 2021-10-01T00:49:59.046000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.271

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Oct 01, 2021 12:49:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 53.151 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 40s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/ltxf7omg5btf6

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2489

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2489/display/redirect?page=changes>

Changes:

[stranniknm] [BEAM-12951]: Create initial structure for Playground application

[rohde.samuel] Fix BEAM-12984

[noreply] Update Beam glossary (#15619)


------------------------------------------
[...truncated 344.31 KB...]
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 30, 2021 6:46:59 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 30, 2021 6:47:01 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 30, 2021 6:47:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 6:47:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:47:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 6:47:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 6:47:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:47:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 6:47:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@415755770]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@991913540]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 6:47:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2021 6:47:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 30, 2021 6:47:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2021 6:47:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2021 6:47:29 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Sep 30, 2021 6:47:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7139866847520102503.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-sUITyF-jIcOE0DPCp5g8ar_3Wzik35VTFkPaHmK9Nik.jar
    Sep 30, 2021 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.15.1/ccd374ca7e5d8b06fb31ecffeb03abc42feada84/kafka-1.15.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.15.1-Z4qZ3jdWXDvdB7xD9VivkdVQiRZWNWUhPSIANVCSZDA.jar
    Sep 30, 2021 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.15.1/91e6dfab8f141f77c6a0dd147a94bd186993a22c/testcontainers-1.15.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.15.1-_fMrLpRXSowMnAISRc73bMk5UqrXNnsv6vkPXOZRXu0.jar
    Sep 30, 2021 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Sep 30, 2021 6:47:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-api/3.2.7/81408fc988c229ea11354fee9902c47842343f04/docker-java-api-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-api-3.2.7-AwtXDacySf3gUHoixVjjNqW0wUI0YcGJjMO-N7kxBho.jar
    Sep 30, 2021 6:47:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Sep 30, 2021 6:47:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Sep 30, 2021 6:47:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Sep 30, 2021 6:47:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Sep 30, 2021 6:47:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport-zerodep/3.2.7/d74cb05fabc57e9ec441dc39d0bc58ad649fad3d/docker-java-transport-zerodep-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-zerodep-3.2.7-O2a9rUb-899yDZMu4E8j9ItFOkNoggqbLP2xaLmOwJY.jar
    Sep 30, 2021 6:47:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport/3.2.7/315903a129f530422747efc163dd255f0fa2555e/docker-java-transport-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-3.2.7-ynNXlSXhDSyiLCXkA4c9E9_HGeKB7-wbpEzmswFrI-A.jar
    Sep 30, 2021 6:47:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 239 files cached, 11 files newly uploaded in 6 seconds
    Sep 30, 2021 6:47:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2021 6:47:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash c9aef7afd66ec0f6fc4129cd0a16db636b491f19f7c80ca12c3146d9d700016e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ya73r9ZuwPb8QSnNChbbY2tJHxn3yAyhLDFG2dcAAW4.pb
    Sep 30, 2021 6:47:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2021 6:47:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 30, 2021 6:47:43 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 30, 2021 6:47:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 30, 2021 6:47:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-30_11_47_44-14843811000132432781?project=apache-beam-testing
    Sep 30, 2021 6:47:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-30_11_47_44-14843811000132432781
    Sep 30, 2021 6:47:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-30_11_47_44-14843811000132432781
    Sep 30, 2021 6:47:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-30T18:47:50.444Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2021 6:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:47:59.801Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 30, 2021 6:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:00.631Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2021 6:48:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:00.678Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2021 6:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:00.711Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2021 6:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:00.775Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2021 6:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:00.823Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2021 6:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:00.856Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 30, 2021 6:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:01.271Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 30, 2021 6:48:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:01.351Z: Starting 5 workers in us-central1-c...
    Sep 30, 2021 6:48:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:28.732Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2021 6:48:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:37.174Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 30, 2021 6:48:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:37.205Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 30, 2021 6:48:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:48:47.443Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2021 6:49:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:49:11.591Z: Workers have started successfully.
    Sep 30, 2021 6:49:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:49:11.619Z: Workers have started successfully.
    Sep 30, 2021 6:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:49:41.139Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 30, 2021 6:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:49:41.266Z: Cleaning up.
    Sep 30, 2021 6:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:49:41.345Z: Stopping worker pool...
    Sep 30, 2021 6:52:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:52:03.674Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2021 6:52:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T18:52:03.851Z: Worker pool stopped.
    Sep 30, 2021 6:52:12 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-30_11_47_44-14843811000132432781 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e7dd9959-3546-470f-b447-57a5bb72fad4 and timestamp: 2021-09-30T18:52:12.078000000Z:
                     Metric:                    Value:
                   read_time                      9.98
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2021 6:52:12 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
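
So the read_time and fields_read values above were computed but never written to InfluxDB. In Beam's test utilities the publisher is configured through pipeline options; a sketch of the missing settings (option names and values are assumptions based on the testutils publishing conventions, not confirmed from this job's configuration) added alongside the existing -DbeamTestPipelineOptions entries:

    "--influxMeasurement=sql_bqio_read_java_batch",
    "--influxDatabase=beam_test_metrics",
    "--influxHost=http://localhost:8086"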

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.058 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.075 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 5 mins 30.206 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings
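
For example, re-running the failing task with the flag from the message:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --warning-mode all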

BUILD FAILED in 7m 19s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/mny4r4enhidhy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2488

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2488/display/redirect>

Changes:


------------------------------------------
[...truncated 339.34 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 42de1e1f803ec917230cdec58ad48498
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 30, 2021 12:44:51 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 30, 2021 12:44:52 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 30, 2021 12:44:52 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 30, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@613927329]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
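
The exception lists its own remedies: call setCoder() explicitly or, since the elements are Beam Rows, attach a schema so a Row coder can be inferred. A minimal sketch of the schema route (field names taken from the SELECT above; types and nullability are assumptions, not read from the test source):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Attaching a schema lets the coder registry infer a Row coder,
      // so PCollection.finishSpecifying() no longer throws.
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        return rows.setRowSchema(schema);
      }
    }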

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
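
That push-down comes from the table declaration rather than the query: with the BigQuery method set to DIRECT_READ, the planner can hand the used fields ([by, type, title, score]) and the supported filter to the BigQuery Storage API instead of reading full rows. A sketch of a declaration that would produce it (the column list and LOCATION are illustrative, not copied from the test; the TBLPROPERTIES key follows the documented bigquery table type):

    CREATE EXTERNAL TABLE HACKER_NEWS (
        `by` VARCHAR,
        type VARCHAR,
        title VARCHAR,
        score INTEGER
    )
    TYPE bigquery
    LOCATION 'apache-beam-testing:dataset.hacker_news'
    TBLPROPERTIES '{"method": "DIRECT_READ"}'
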
    Sep 30, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Sep 30, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test888665778287048086.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-CFgx9LkNDXWIzF4L_Q8WWHAUnGvAvMK_daSGkpVuM8k.jar
    Sep 30, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 30, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104892 bytes, hash b4bf1acc6157a06cd79cecbba5e2af6c3f56cc71eb97ad21cbe6128e77c69dfb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tL8azGFXoGzXnOy7peKvbD9WzHHrl60hy-YSjnfGnfs.pb
    Sep 30, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 30, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 30, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 30, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-30_05_45_04-8137050605073411640?project=apache-beam-testing
    Sep 30, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-30_05_45_04-8137050605073411640
    Sep 30, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-30_05_45_04-8137050605073411640
    Sep 30, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-30T12:45:07.611Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:45:13.604Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:45:14.437Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:45:14.478Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:45:14.509Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:45:14.587Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:45:14.613Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:45:14.646Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:45:14.974Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:45:15.050Z: Starting 5 workers in us-central1-a...
    Sep 30, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:45:19.793Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:46:01.997Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:46:30.848Z: Workers have started successfully.
    Sep 30, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:46:30.871Z: Workers have started successfully.
    Sep 30, 2021 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:47:02.871Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 30, 2021 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:47:02.997Z: Cleaning up.
    Sep 30, 2021 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:47:03.090Z: Stopping worker pool...
    Sep 30, 2021 12:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:49:35.992Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2021 12:49:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T12:49:36.074Z: Worker pool stopped.
    Sep 30, 2021 12:49:43 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-30_05_45_04-8137050605073411640 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 972d64f7-d3b2-4841-8f57-1bb2826145c6 and timestamp: 2021-09-30T12:49:43.636000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.914

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2021 12:49:44 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 56.406 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 26s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/eaexy6eq2bbxy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2487

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2487/display/redirect>

Changes:


------------------------------------------
[...truncated 337.79 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 42de1e1f803ec917230cdec58ad48498
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 30, 2021 6:44:52 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 30, 2021 6:44:52 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 30, 2021 6:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 30, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@613927329]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
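
The trace shows where the coder is demanded: the test converts the planned rel node back into a PCollection via BeamSqlRelUtils.toPCollection, and finishSpecifying fails there. A sketch of that call pattern (the sqlEnv and query variables are assumed stand-ins for the test's setup):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Plan the SQL shown above, then materialize it; the coder check in the
    // stack trace fires inside toPCollection when the output lacks a schema.
    BeamRelNode node = sqlEnv.parseQuery(query);
    PCollection<Row> rows = BeamSqlRelUtils.toPCollection(pipeline, node);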

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 30, 2021 6:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Sep 30, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2790166041164843000.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-mTB5mjFKQujtR6LcWttiPQ9brPwnAPx19bogbCgUijI.jar
    Sep 30, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 30, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104893 bytes, hash 8b044eaaacaa7e7f8aac2783740cb80a1c2e999f4b8ff427738710ecfb472750> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-iwROqqyqfn-KrCeDdAy4ChwumZ9Lj_Qnc4cQ7PtHJ1A.pb
    Sep 30, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 30, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 30, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 30, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-29_23_45_04-480439106660977662?project=apache-beam-testing
    Sep 30, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-29_23_45_04-480439106660977662
    Sep 30, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-29_23_45_04-480439106660977662
    Sep 30, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-30T06:45:08.114Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:45:13.883Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 30, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:45:14.778Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:45:14.820Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:45:14.844Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:45:14.893Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:45:14.911Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:45:14.932Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 30, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:45:15.268Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 30, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:45:15.337Z: Starting 5 workers in us-central1-a...
    Sep 30, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:45:36.225Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2021 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:46:00.035Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2021 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:46:26.000Z: Workers have started successfully.
    Sep 30, 2021 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:46:26.033Z: Workers have started successfully.
    Sep 30, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:46:53.177Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 30, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:46:53.319Z: Cleaning up.
    Sep 30, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:46:53.400Z: Stopping worker pool...
    Sep 30, 2021 6:49:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:49:23.738Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2021 6:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T06:49:23.783Z: Worker pool stopped.
    Sep 30, 2021 6:49:30 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-29_23_45_04-480439106660977662 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): cacf32ed-da08-4bcc-a773-ddf9035f0ba3 and timestamp: 2021-09-30T06:49:30.851000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.082

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2021 6:49:31 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 42.826 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 12s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/xelkpgsh6jara

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2486

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2486/display/redirect?page=changes>

Changes:

[kileysok] Update windmill state map to resolve *IfAbsent methods immediately

[piotr.szczepanik] [BEAM-12356] Fixed last non-cached usage of DatasetService in BigQuery

[noreply] [BEAM-12977] Translates Reshuffle in Portable Mode with Samza native

[noreply] [BEAM-12982] Help users debug which JvmInitializer is running and when.


------------------------------------------
[...truncated 344.50 KB...]
> Task :sdks:java:extensions:sql:perf-tests:testClasses
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) started.
Gradle Test Executor 3 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 42de1e1f803ec917230cdec58ad48498
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 3'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 3'
Successfully started process 'Gradle Test Executor 3'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 30, 2021 12:46:03 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 30, 2021 12:46:04 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 30, 2021 12:46:05 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 30, 2021 12:46:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 12:46:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:46:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 12:46:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 12:46:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 12:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1413020570]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
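
    The failure above is Beam's standard coder-inference error for a PCollection of Row: Row has no
    default Coder, so the collection needs an explicit Schema. Below is a minimal sketch of the fix the
    message itself suggests; the schema and the stand-in DoFn are illustrative assumptions, not the
    IT's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        // Illustrative schema matching the columns projected by the query above.
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();
        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema)
                    .addValues("user", "story", "Example", 3).build())
                .withRowSchema(schema));
        rows.apply(ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void process(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            // Without this call, coder inference for the ParDo's Row output
            // fails with exactly the IllegalStateException logged above.
            .setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }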

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 30, 2021 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2021 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 30, 2021 12:46:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:46:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 12:46:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@210017548]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
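
    The exception also lists setCoder() as its first remedy; for Row elements that means a
    schema-backed RowCoder. A one-line alternative, assuming the rows and schema from the sketch
    above:

    import org.apache.beam.sdk.coders.RowCoder;

    // Equivalent to setRowSchema(schema) for Row elements: attach the
    // schema-backed coder directly instead of relying on inference.
    rows.setCoder(RowCoder.of(schema));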

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2021 12:46:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 12:46:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:46:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 30, 2021 12:46:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 30, 2021 12:46:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 30, 2021 12:46:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 30, 2021 12:46:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 30, 2021 12:46:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
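
    The BEAMPlan above shows the planner replacing the Calc with a BeamPushDownIOSourceRel,
    projecting only the used fields and handing the whole filter to BigQuery. The sketch below
    reproduces the same query shape through SqlTransform over an in-memory table; the table wiring is
    an assumption, and real filter pushdown still requires the BigQuery table provider, so here the
    filter runs in a plain Calc instead.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    public class PushDownQueryShapeSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        Schema schema = Schema.builder()
            .addStringField("by").addStringField("type")
            .addStringField("title").addInt32Field("score")
            .build();
        PCollection<Row> hackerNews = p.apply(
            Create.of(Row.withSchema(schema)
                    .addValues("user", "story", "Example", 5).build())
                .withRowSchema(schema));
        // Same query shape as the IT; `by` needs backticks in Beam SQL.
        PCollection<Row> filtered =
            PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), hackerNews)
                .apply(SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }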
    Sep 30, 2021 12:46:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 30, 2021 12:46:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 30, 2021 12:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-d2T2Ww3PkGIpGazPxOGPmNWYNJy49hfJckZ3R5mrrWo.jar
    Sep 30, 2021 12:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test573253334719175251.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lVVrtBnXYSgG1a2yKE-olbo9sFY3aO8mu-UE6wX2Qas.jar
    Sep 30, 2021 12:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 30, 2021 12:46:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 30, 2021 12:46:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash 2d97c0ed6cfde58fd8348c7ced6693d03bd8fa5d1dde615f54f569d976923356> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LZfA7Wz95Y_YNIx87WaT0DvY-l0d3mFfVPVp2XaSM1Y.pb
    Sep 30, 2021 12:46:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 30, 2021 12:46:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 30, 2021 12:46:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 30, 2021 12:46:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 30, 2021 12:46:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-29_17_46_17-1488063317250134391?project=apache-beam-testing
    Sep 30, 2021 12:46:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-29_17_46_17-1488063317250134391
    Sep 30, 2021 12:46:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-29_17_46_17-1488063317250134391
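
    Besides the gcloud command printed above, a submitted job can also be cancelled programmatically
    through the PipelineResult handle. A small sketch, assuming a Pipeline p already configured for
    DataflowRunner via its options:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    class CancelSketch {
      static void runAndCancel(Pipeline p) throws IOException {
        PipelineResult result = p.run();
        // Programmatic counterpart of the gcloud cancel command above.
        result.cancel();
      }
    }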
    Sep 30, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-30T00:46:22.320Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 30, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:46:27.775Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 30, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:46:28.466Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 30, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:46:28.515Z: Expanding GroupByKey operations into optimizable parts.
    Sep 30, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:46:28.560Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 30, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:46:28.637Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 30, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:46:28.666Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 30, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:46:28.712Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 30, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:46:29.230Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 30, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:46:29.305Z: Starting 5 workers in us-central1-a...
    Sep 30, 2021 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:46:42.628Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 30, 2021 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:47:20.081Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 30, 2021 12:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:47:45.242Z: Workers have started successfully.
    Sep 30, 2021 12:47:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:47:45.269Z: Workers have started successfully.
    Sep 30, 2021 12:48:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:48:16.697Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 30, 2021 12:48:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:48:16.897Z: Cleaning up.
    Sep 30, 2021 12:48:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:48:16.985Z: Stopping worker pool...
    Sep 30, 2021 12:50:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:50:36.616Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 30, 2021 12:50:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-30T00:50:36.663Z: Worker pool stopped.
    Sep 30, 2021 12:50:41 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-29_17_46_17-1488063317250134391 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c877a94b-b4d5-42f1-aa89-e2bff151cd7b and timestamp: 2021-09-30T00:50:41.950000000Z:
                     Metric:                    Value:
                   read_time                     9.085
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 30, 2021 12:50:42 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 42.08 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 24s
152 actionable tasks: 102 executed, 50 from cache

Publishing build scan...
https://gradle.com/s/3fjnuyutmrfsg

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2485

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2485/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-12942] Validate pubsub messages before they are published in

[clairem] [BEAM-12628] make useReflectApi default to true

[noreply] [BEAM-12856] Change hard-coded limits for reading from a UnboundedReader


------------------------------------------
[...truncated 359.71 KB...]
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 572a460eb19ebb7d6397f0db1a677cfc
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 11'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 11'
Successfully started process 'Gradle Test Executor 11'

Gradle Test Executor 11 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 29, 2021 6:58:14 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 29, 2021 6:58:16 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 29, 2021 6:58:17 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 29, 2021 6:58:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 6:58:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:58:54 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 6:58:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 6:58:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:58:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 6:58:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@769449421]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2021 6:59:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2021 6:59:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:59:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 6:59:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2021 6:59:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:59:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 6:59:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1710747748]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2021 6:59:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 6:59:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:59:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 6:59:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 6:59:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:59:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 6:59:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2021 6:59:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 29, 2021 6:59:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2021 6:59:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2021 6:59:47 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-JPP4cyCHQxn010diAGAefPjsl8-1s3Nta1xm7CweWOs.jar
    Sep 29, 2021 6:59:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2195400849734178326.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-E1qygH0ujm0NqN53mzUn7Qw1X8k6RhcpwSlfU35zayc.jar
    Sep 29, 2021 6:59:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 1 seconds
    Sep 29, 2021 6:59:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2021 6:59:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104889 bytes, hash a7f8094b559bc23ce8d5449d7564caa2e829b036fd58bdecd261207045f02f70> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-p_gJS1Wbwjzo1USddWTKougpsDb9WL3s0mEgcEXwL3A.pb
    Sep 29, 2021 6:59:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2021 6:59:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 29, 2021 6:59:51 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 29, 2021 6:59:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 29, 2021 6:59:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-29_11_59_51-1748228159735481124?project=apache-beam-testing
    Sep 29, 2021 6:59:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-29_11_59_51-1748228159735481124
    Sep 29, 2021 6:59:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-29_11_59_51-1748228159735481124
    Sep 29, 2021 6:59:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-29T18:59:55.000Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2021 7:01:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:01:28.347Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2021 7:01:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:01:31.107Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 29, 2021 7:01:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:01:31.933Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2021 7:01:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:01:32.070Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2021 7:01:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:01:32.134Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2021 7:01:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:01:32.275Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2021 7:01:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:01:32.333Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2021 7:01:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:01:32.400Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 29, 2021 7:01:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:01:33.863Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 29, 2021 7:01:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:01:34.065Z: Starting 5 workers in us-central1-c...
    Sep 29, 2021 7:02:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:02:08.491Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 29, 2021 7:02:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:02:08.561Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 29, 2021 7:02:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:02:18.897Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2021 7:02:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:02:58.535Z: Workers have started successfully.
    Sep 29, 2021 7:02:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:02:58.591Z: Workers have started successfully.
    Sep 29, 2021 7:03:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:03:30.747Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 29, 2021 7:03:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:03:31.457Z: Cleaning up.
    Sep 29, 2021 7:03:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:03:32.659Z: Stopping worker pool...
    Sep 29, 2021 7:05:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:05:46.159Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2021 7:05:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T19:05:46.340Z: Worker pool stopped.
    Sep 29, 2021 7:06:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-29_11_59_51-1748228159735481124 finished with status DONE.


Gradle Test Executor 11 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 61fc3acd-7f0a-4563-a089-8fbdfdf37816 and timestamp: 2021-09-29T19:06:45.080000000Z:
                     Metric:                    Value:
                   read_time                     7.666
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2021 7:06:45 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

3 tests completed, 2 failed
Finished generating test XML results (0.011 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.005 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 8 mins 47.008 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 21m 23s
152 actionable tasks: 109 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/7d7qoff3k77au

Stopped 4 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2484

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2484/display/redirect>

Changes:


------------------------------------------
[...truncated 337.20 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 8fac3e85e47cb465aca8eadd097ae6c8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 29, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 29, 2021 12:44:59 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 29, 2021 12:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 29, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 29, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-mkwBNRU4-IrDjMZ0CMdMym0ZilYRrhqnvMLb7Axcm90.jar
    Sep 29, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test657470287928516511.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-HWEZBcNCPETt_MjilxZjiWsuusvE0FuEl60l020V96U.jar
    Sep 29, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 29, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104892 bytes, hash 7bd744d3e6cebdec48e398023fe98c13ec19c56050b1522239d987dbffb1a4b9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-e9dE0-bOvexI45gCP-mME-wZxWBQsVIiOdmH2_-xpLk.pb
    Sep 29, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 29, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 29, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 29, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-29_05_45_12-4445359840033513456?project=apache-beam-testing
    Sep 29, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-29_05_45_12-4445359840033513456
    Sep 29, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-29_05_45_12-4445359840033513456
    Sep 29, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-29T12:45:19.145Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:45:26.872Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 29, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:45:27.767Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:45:27.806Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:45:27.833Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:45:27.907Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:45:27.927Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:45:27.961Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 29, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:45:28.299Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 29, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:45:28.377Z: Starting 5 workers in us-central1-a...
    Sep 29, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:45:33.133Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2021 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:46:15.299Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2021 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:46:41.320Z: Workers have started successfully.
    Sep 29, 2021 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:46:41.353Z: Workers have started successfully.
    Sep 29, 2021 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:47:10.496Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 29, 2021 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:47:10.620Z: Cleaning up.
    Sep 29, 2021 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:47:10.706Z: Stopping worker pool...
    Sep 29, 2021 12:50:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:50:05.668Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2021 12:50:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T12:50:05.733Z: Worker pool stopped.
    Sep 29, 2021 12:50:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-29_05_45_12-4445359840033513456 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a473a4e6-b787-4486-bdbc-c44588f3e576 and timestamp: 2021-09-29T12:50:16.855000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.501

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2021 12:50:17 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
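
This warning is why no metrics reach InfluxDB even though the pipeline itself finished with status DONE: the publisher is missing its target settings. Presumably they would be passed alongside the other -DbeamTestPipelineOptions entries shown in the Gradle command line further down; the option names in the fragment below (--influxHost, --influxDatabase, --influxMeasurement) are an assumption based on Beam's test utilities, not confirmed by this log.

    "--influxHost=http://localhost:8086","--influxDatabase=beam_test_metrics","--influxMeasurement=sql_bqio_read_java_batch"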

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.189 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 5 mins 22.696 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 58s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/nclbmw3pccass

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2483

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2483/display/redirect>

Changes:


------------------------------------------
[...truncated 345.98 KB...]
Gradle Test Executor 3 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 8fac3e85e47cb465aca8eadd097ae6c8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 3'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 3'
Successfully started process 'Gradle Test Executor 3'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 29, 2021 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 29, 2021 6:45:15 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 29, 2021 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 29, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:45:19 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 6:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 6:45:19 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 6:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
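
The exception above spells out the remedy: a PCollection of Beam Rows has no inferable Coder, so the schema must be attached explicitly with PCollection.setRowSchema. The following minimal sketch shows that fix; the schema and the pass-through DoFn are hypothetical stand-ins for the HACKER_NEWS columns and the RowMonitor ParDo named in the trace, not the test's actual code.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema mirroring the four columns the query selects.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema)
                    .addValues("alice", "story", "A title", 3)
                    .build())
                .withRowSchema(schema));

        // A pass-through DoFn standing in for ParDo(RowMonitor).
        PCollection<Row> monitored = rows
            .apply(ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            // The call the error message asks for: without it,
            // finishSpecifying fails exactly as in the stack trace above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Attaching the schema on the ParDo output is enough for finishSpecifying to derive a Row coder for every downstream consumer.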

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2021 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2021 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1299075293]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
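
In contrast to the BeamCalcRel plans of the two failing tests, the plan above shows push-down working as intended: only the four used fields are read, and the filter is evaluated by the BigQuery Storage API rather than inside Beam. Writing the equivalent read by hand with BigQueryIO looks roughly like the sketch below; the table path is illustrative, not taken from this job.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // illustrative table
                .withMethod(Method.DIRECT_READ)                 // BigQuery Storage Read API
                // Column projection, matching usedFields in the plan above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down, matching the filter logged above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

The SQL path derives withSelectedFields and withRowRestriction automatically from the usedFields and BigQueryFilter shown in the BeamPushDownIOSourceRel; the hand-written form just makes them explicit.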
    Sep 29, 2021 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-mkwBNRU4-IrDjMZ0CMdMym0ZilYRrhqnvMLb7Axcm90.jar
    Sep 29, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4767608592662490288.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SPSuFqghUI4aME98kWQbAu6dIX6hLa538WUDFVieR24.jar
    Sep 29, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 file newly uploaded in 0 seconds
    Sep 29, 2021 6:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104890 bytes, hash f06cde5785855ef35d79f1cb8dd910372587f22dd2ff2260569ab8751217abbc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8GzeV4WFXvNdefHLjdkQNyWH8i3S_yJgVpq4dRIXq7w.pb
    Sep 29, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 29, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 29, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 29, 2021 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-28_23_45_28-6780201276708781993?project=apache-beam-testing
    Sep 29, 2021 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-28_23_45_28-6780201276708781993
    Sep 29, 2021 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-28_23_45_28-6780201276708781993
    Sep 29, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-29T06:45:31.977Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:45:40.972Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 29, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:45:41.663Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:45:41.703Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:45:41.730Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:45:41.807Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:45:41.838Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:45:41.867Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 29, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:45:42.247Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 29, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:45:42.316Z: Starting 5 workers in us-central1-c...
    Sep 29, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:45:50.742Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2021 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:46:17.907Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 29, 2021 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:46:17.945Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 29, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:46:28.179Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:46:55.358Z: Workers have started successfully.
    Sep 29, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:46:55.407Z: Workers have started successfully.
    Sep 29, 2021 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:47:26.008Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 29, 2021 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:47:26.193Z: Cleaning up.
    Sep 29, 2021 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:47:26.288Z: Stopping worker pool...
    Sep 29, 2021 6:49:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:49:48.018Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2021 6:49:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T06:49:48.079Z: Worker pool stopped.
    Sep 29, 2021 6:49:54 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-28_23_45_28-6780201276708781993 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 65a611b5-7c37-478c-b9ac-7cb85651a9ac and timestamp: 2021-09-29T06:49:54.463000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.573

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2021 6:49:54 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 43.423 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 37s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/y4mbdkxl577uk

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2482

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2482/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11129] Add namespace and key to portable display data (#15564)

[noreply] [BEAM-12906] Add a `dataframe` extra for installing a pandas version


------------------------------------------
[...truncated 366.26 KB...]
Gradle Test Executor 7 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 8fac3e85e47cb465aca8eadd097ae6c8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 7'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 7'
Successfully started process 'Gradle Test Executor 7'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 29, 2021 12:49:35 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 29, 2021 12:49:36 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 29, 2021 12:49:37 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 29, 2021 12:49:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 12:49:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:49:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 12:49:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 12:49:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:49:41 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 12:49:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 29, 2021 12:49:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2021 12:49:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:49:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 12:49:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 29, 2021 12:49:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:49:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 12:49:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2021 12:49:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 12:49:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:49:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 29, 2021 12:49:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 29, 2021 12:49:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 29, 2021 12:49:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 29, 2021 12:49:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 29, 2021 12:49:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 29, 2021 12:49:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 29, 2021 12:49:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 29, 2021 12:49:48 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-mkwBNRU4-IrDjMZ0CMdMym0ZilYRrhqnvMLb7Axcm90.jar
    Sep 29, 2021 12:49:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7083351408459195147.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-aJjNwohAk2chZZrWcDhxFA3fmi0rcx5ZcQijQ58Wfsw.jar
    Sep 29, 2021 12:49:49 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 file newly uploaded in 0 seconds
    Sep 29, 2021 12:49:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 29, 2021 12:49:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104891 bytes, hash c4c25d34dc802fe4e8a921f61e04eea39bf8537c0aa431f9465055ae43f49e96> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-xMJdNNyAL-ToqSH2HgTuo5v4U3wKpDH5RlBVrkP0npY.pb
    Sep 29, 2021 12:49:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 29, 2021 12:49:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 29, 2021 12:49:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 29, 2021 12:49:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 29, 2021 12:49:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-28_17_49_52-11142279470977090787?project=apache-beam-testing
    Sep 29, 2021 12:49:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-28_17_49_52-11142279470977090787
    Sep 29, 2021 12:49:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-28_17_49_52-11142279470977090787
    Sep 29, 2021 12:49:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-29T00:49:55.374Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 29, 2021 12:50:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:01.028Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 29, 2021 12:50:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:01.916Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 29, 2021 12:50:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:01.966Z: Expanding GroupByKey operations into optimizable parts.
    Sep 29, 2021 12:50:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:02.016Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 29, 2021 12:50:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:02.115Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 29, 2021 12:50:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:02.165Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 29, 2021 12:50:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:02.202Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 29, 2021 12:50:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:02.641Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 29, 2021 12:50:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:02.764Z: Starting 5 workers in us-central1-a...
    Sep 29, 2021 12:50:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:18.178Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 29, 2021 12:50:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:39.435Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 29, 2021 12:50:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:39.469Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 29, 2021 12:50:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:50:49.832Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 29, 2021 12:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:51:14.831Z: Workers have started successfully.
    Sep 29, 2021 12:51:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:51:14.871Z: Workers have started successfully.
    Sep 29, 2021 12:51:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:51:43.695Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 29, 2021 12:51:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:51:43.827Z: Cleaning up.
    Sep 29, 2021 12:51:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:51:43.894Z: Stopping worker pool...
    Sep 29, 2021 12:54:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:54:10.501Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 29, 2021 12:54:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-29T00:54:10.549Z: Worker pool stopped.
    Sep 29, 2021 12:54:16 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-28_17_49_52-11142279470977090787 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 927253f6-e483-4a48-876a-0686275b0456 and timestamp: 2021-09-29T00:54:16.280000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.284

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 29, 2021 12:54:16 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 7 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.042 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.052 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 46.545 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 52s
152 actionable tasks: 115 executed, 37 from cache

Publishing build scan...
https://gradle.com/s/gsmiwhly6qg4o

Stopped 6 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2481

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2481/display/redirect?page=changes>

Changes:

[chamikaramj] Sets the correct coder in 'ConstantTableDestinations' when using BQ

[Luke Cwik] [BEAM-12974] Migrate off of deprecated package for MockitoJUnitRunner

[noreply] [BEAM-12593] Verify DataFrame API on pandas 1.3 (with container update)

[noreply] [BEAM-12973] Print Go Test and Script info. (#15604)

[noreply] [BEAM-10913] - Installing and persisting Grafana plugin in kubernetes


------------------------------------------
[...truncated 341.27 KB...]
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 957a10a4d509128460557542c6f8bab1
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 28, 2021 6:52:32 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 28, 2021 6:52:33 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 28, 2021 6:52:33 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 28, 2021 6:52:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 6:52:36 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:52:37 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 6:52:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 6:52:37 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:52:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 6:52:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
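
This is Beam's standard missing-coder failure for a PCollection<Row>: the
Row type alone gives the CoderRegistry nothing to work with, so a schema
(or an explicit RowCoder) has to be attached, exactly as the message says.
A minimal sketch of the fix, assuming the four-column shape of the query
above (field names and nullability here are illustrative):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Attaching a schema lets Beam infer a RowCoder for the collection.
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();
        rows.setRowSchema(schema);
        // Equivalent explicit form:
        // rows.setCoder(org.apache.beam.sdk.coders.RowCoder.of(schema));
        return rows;
      }
    }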

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2021 6:52:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2021 6:52:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:52:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 6:52:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2021 6:52:38 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:52:38 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 6:52:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2021 6:52:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 6:52:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:52:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 6:52:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 6:52:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:52:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 6:52:39 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2021 6:52:39 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
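
    The BeamPushDownIOSourceRel above is the point of this test: with
    DIRECT_READ, the projection (usedFields) and the supported filter are
    handed to the BigQuery Storage API instead of being evaluated inside the
    pipeline. A hand-written read with the same effect, as a sketch (the
    table name is illustrative; withMethod/withSelectedFields/
    withRowRestriction are the standard BigQueryIO knobs this push-down
    maps onto):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDown {
      public static void main(String[] args) {
        Pipeline p =
            Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply("Read Input BQ Rows with push-down",
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.HACKER_NEWS") // illustrative
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Only these columns are fetched from storage...
                .withSelectedFields(
                    Arrays.asList("by", "type", "title", "score"))
                // ...and the filter is evaluated server-side.
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }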
    Sep 28, 2021 6:52:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2021 6:52:43 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2021 6:52:43 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 28, 2021 6:52:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test489072803952744924.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-uQraVB0IQDQ2YOEU5iWVuptBhQeg6Zsso9-RMi6dsZM.jar
    Sep 28, 2021 6:52:44 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 28, 2021 6:52:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2021 6:52:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 228eacc7913c31784768f960c3e984e786c6ca5e5acdfaa1f0ea31ce84cc06eb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Io6sx5E8MXhHaPlgw-mE54bGyl5azfqh8OoxzoTMBus.pb
    Sep 28, 2021 6:52:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2021 6:52:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 28, 2021 6:52:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 28, 2021 6:52:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 28, 2021 6:52:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-28_11_52_46-17894663093463120524?project=apache-beam-testing
    Sep 28, 2021 6:52:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-28_11_52_46-17894663093463120524
    Sep 28, 2021 6:52:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-28_11_52_46-17894663093463120524
    Sep 28, 2021 6:52:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-28T18:52:50.593Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2021 6:52:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:52:58.642Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 28, 2021 6:52:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:52:59.411Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2021 6:52:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:52:59.547Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2021 6:52:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:52:59.595Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2021 6:52:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:52:59.752Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2021 6:53:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:52:59.832Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2021 6:53:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:52:59.892Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 28, 2021 6:53:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:53:00.634Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 28, 2021 6:53:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:53:00.777Z: Starting 5 workers in us-central1-c...
    Sep 28, 2021 6:53:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:53:14.245Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2021 6:53:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:53:33.812Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 28, 2021 6:53:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:53:33.843Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 28, 2021 6:53:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:53:44.087Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2021 6:54:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:54:10.136Z: Workers have started successfully.
    Sep 28, 2021 6:54:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:54:10.195Z: Workers have started successfully.
    Sep 28, 2021 6:54:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:54:49.884Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 28, 2021 6:54:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:54:50.184Z: Cleaning up.
    Sep 28, 2021 6:54:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:54:50.437Z: Stopping worker pool...
    Sep 28, 2021 6:57:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:57:09.461Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2021 6:57:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T18:57:09.527Z: Worker pool stopped.
    Sep 28, 2021 6:57:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-28_11_52_46-17894663093463120524 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2161815d-114f-4e1c-8fa6-f57c56f413fd and timestamp: 2021-09-28T18:57:15.777000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    17.656
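
    The read_time figure above is produced by the ParDo(TimeMonitor) step in
    the fused stage: it records a timestamp per element into a distribution
    metric, and the elapsed read window can then be derived from that
    distribution's min and max. A sketch of such a monitor using Beam's
    Metrics API (class and metric names here are assumptions, not the
    test's actual code):

    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    class TimeMonitorFn extends DoFn<Row, Row> {
      private final Distribution times =
          Metrics.distribution("perf_test", "runtime");

      @ProcessElement
      public void processElement(ProcessContext c) {
        times.update(System.currentTimeMillis()); // one sample per element
        c.output(c.element());
      }
    }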

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2021 6:57:16 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
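
    This warning means the metrics shown above were computed but not
    exported: the InfluxDB publisher needs a measurement and database to be
    configured. They are normally passed as extra flags inside
    -DbeamTestPipelineOptions; the option names below are assumed from
    Beam's test utilities and the values are purely illustrative:

    --influxMeasurement=sql_bqio_read_java_batch
    --influxDatabase=beam_test_metrics
    --influxHost=http://localhost:8086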

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 47.341 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 36s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/lrmzmpon2amzc

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2480

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2480/display/redirect>

Changes:


------------------------------------------
[...truncated 339.67 KB...]
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is ad194d210d40413101015464770bfca6
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 28, 2021 12:44:55 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 28, 2021 12:44:56 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 28, 2021 12:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 28, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 28, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 28, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6972548575975067231.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6qcaWwe3GKP3EJBBpRszHeC07YU6vdlHz0z8wMhayPc.jar
    Sep 28, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 28, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash e037fa70c507fa1cc6c773c71bec1353a5da935c79caefa00b24b7e6d8155b55> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4Df6cMUH-hzGx3PHG-wTU6Xak1x5yu-gCyS35tgVW1U.pb
    Sep 28, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 28, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 28, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 28, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-28_05_45_10-332958398704530790?project=apache-beam-testing
    Sep 28, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-28_05_45_10-332958398704530790
    Sep 28, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-28_05_45_10-332958398704530790
    Sep 28, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-28T12:45:13.593Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:22.446Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:23.184Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:23.212Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:23.248Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:23.336Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:23.359Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:23.383Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:23.681Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 28, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:23.752Z: Starting 5 workers in us-central1-c...
    Sep 28, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:56.297Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:58.106Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 28, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:45:58.151Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 28, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:46:08.368Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:46:31.538Z: Workers have started successfully.
    Sep 28, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:46:31.574Z: Workers have started successfully.
    Sep 28, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:47:12.136Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 28, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:47:12.300Z: Cleaning up.
    Sep 28, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:47:12.394Z: Stopping worker pool...
    Sep 28, 2021 12:49:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:49:31.463Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2021 12:49:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T12:49:31.522Z: Worker pool stopped.
    Sep 28, 2021 12:49:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-28_05_45_10-332958398704530790 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 07445f19-266b-4e41-bda7-1f1d29bf5cc3 and timestamp: 2021-09-28T12:49:41.001000000Z:
                     Metric:                    Value:
                   read_time                    17.122
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2021 12:49:41 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 50.158 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 22s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/dni5fye7dagum

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2479

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2479/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12611] Capture a greater proporition of logs associated to an


------------------------------------------
[...truncated 336.99 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is ad194d210d40413101015464770bfca6
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 28, 2021 6:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 28, 2021 6:44:57 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 28, 2021 6:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 28, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
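
The BeamPushDownIOSourceRel above is the interesting part of this (passing) test: both the projection (usedFields) and the filter are handed to the source instead of being applied in a BeamCalcRel afterwards. In BigQueryIO terms this corresponds roughly to a DIRECT_READ with selected fields and a row restriction, sketched below (the table reference is an illustrative stand-in, not the test's actual table):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    // Roughly what the pushed-down plan requests from the BigQuery Storage Read API.
    BigQueryIO.TypedRead<TableRow> read =
        BigQueryIO.readTableRows()
            .withMethod(TypedRead.Method.DIRECT_READ)
            .from("some-project:some_dataset.HACKER_NEWS") // illustrative table reference
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
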
    Sep 28, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 28, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8826700596754350547.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YGzE0tq0cHi5e33soYf-VTEG1yaLQCfKEhpRJW5tNqs.jar
    Sep 28, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 28, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 3ecb1a7f956a4f472e94d873bf27191d68a0e2769c96f1a74f3f42797f81c4b0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Pssaf5VqT0culNhzvycZHWig4naclvGnTz9CeX-BxLA.pb
    Sep 28, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 28, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 28, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 28, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-27_23_45_10-16000690830406534922?project=apache-beam-testing
    Sep 28, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-27_23_45_10-16000690830406534922
    Sep 28, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-27_23_45_10-16000690830406534922
    Sep 28, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-28T06:45:14.080Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:45:19.547Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 28, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:45:20.204Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:45:20.262Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:45:20.295Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:45:20.357Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:45:20.385Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:45:20.419Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 28, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:45:20.778Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 28, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:45:20.859Z: Starting 5 workers in us-central1-a...
    Sep 28, 2021 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:45:48.174Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2021 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:46:06.583Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2021 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:46:33.200Z: Workers have started successfully.
    Sep 28, 2021 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:46:33.233Z: Workers have started successfully.
    Sep 28, 2021 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:46:59.099Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 28, 2021 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:46:59.284Z: Cleaning up.
    Sep 28, 2021 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:46:59.356Z: Stopping worker pool...
    Sep 28, 2021 6:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:49:26.083Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2021 6:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T06:49:26.165Z: Worker pool stopped.
    Sep 28, 2021 6:49:31 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-27_23_45_10-16000690830406534922 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 73e6f619-a8a0-44a6-bc65-affb1c64ee5b and timestamp: 2021-09-28T06:49:31.629000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     6.836

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2021 6:49:32 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
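
This warning affects only metrics publication, not the test verdict: the InfluxDB publisher skips the write when its measurement/database settings are absent. In Beam's test utilities these are normally supplied as extra pipeline options, presumably along the lines of --influxDatabase=... and --influxMeasurement=... together with --influxHost (option names assumed from the test infrastructure; they do not appear in this log).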

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 39.841 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 12s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/mwtdigdmrz2bm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2478

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2478/display/redirect?page=changes>

Changes:

[Andrew Pilloud] [BEAM-12691] FieldAccessDescriptor for BeamCalcRel

[aydar.zaynutdinov] Init basic go backend structure

[aydar.zaynutdinov] Add .gitkeep file as an exclusion

[yileiyang] Remove encoding= from the json.loads call.

[noreply] [BEAM-12798] Add configurable combiner packing limit (#15391)

[noreply] [BEAM-12926] Translates Reshuffle with Samza native repartition operator

[noreply] [BEAM-11007] Allow nbconvert 6.x (#15595)

[noreply] [BEAM-12945] Fix crashes when importing the DataFrame API with pandas


------------------------------------------
[...truncated 340.88 KB...]
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is ad194d210d40413101015464770bfca6
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'
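
The -DbeamTestPipelineOptions JSON in the command line above is how the test JVM receives its pipeline options; TestPipeline reads that system property and feeds the arguments through the standard options factory, roughly as follows (a sketch of the flow, not the harness's literal code):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // A few of the options from the command line above, parsed the standard way.
    String[] args = {
      "--project=apache-beam-testing",
      "--runner=DataflowRunner",
      "--region=us-central1",
      "--numWorkers=5",
      "--maxNumWorkers=5",
      "--autoscalingAlgorithm=NONE"
    };
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);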

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 28, 2021 12:46:08 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 28, 2021 12:46:08 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 28, 2021 12:46:09 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 28, 2021 12:46:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 12:46:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:46:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 12:46:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 12:46:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:46:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 12:46:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 28, 2021 12:46:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 28, 2021 12:46:15 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 28, 2021 12:46:15 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 28, 2021 12:46:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 28, 2021 12:46:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 28, 2021 12:46:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 28, 2021 12:46:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests-__o_cT2WEL8WknZtwTW8aQevksYt5UgSYWmasm174nQ.jar
    Sep 28, 2021 12:46:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3858642203605244807.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LwrHOCGbgC2ue4bcblknD22fEXbokBbZBKLgLWHT-m4.jar
    Sep 28, 2021 12:46:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-ZgoREHN3SqOQclIGldjYWlJZYlnicfJ-w-LJBzxpP7U.jar
    Sep 28, 2021 12:46:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 3 files newly uploaded in 0 seconds
    Sep 28, 2021 12:46:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 28, 2021 12:46:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 56ec63180b31a16f47e66aca7425904f421affcf7986463dd5879a4f338982cc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-VuxjGAsxoW9H5mrKdCWQT0Ia_895hkY91YeaTzOJgsw.pb
    Sep 28, 2021 12:46:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 28, 2021 12:46:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 28, 2021 12:46:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 28, 2021 12:46:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 28, 2021 12:46:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-27_17_46_21-4927795710480121675?project=apache-beam-testing
    Sep 28, 2021 12:46:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-27_17_46_21-4927795710480121675
    Sep 28, 2021 12:46:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-27_17_46_21-4927795710480121675
    Sep 28, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-28T00:46:24.808Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 28, 2021 12:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:46:31.991Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 28, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:46:32.740Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 28, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:46:32.790Z: Expanding GroupByKey operations into optimizable parts.
    Sep 28, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:46:32.826Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 28, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:46:32.900Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 28, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:46:32.937Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 28, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:46:32.971Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 28, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:46:33.340Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 28, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:46:33.417Z: Starting 5 workers in us-central1-a...
    Sep 28, 2021 12:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:47:02.860Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 28, 2021 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:47:25.907Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 28, 2021 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:47:50.298Z: Workers have started successfully.
    Sep 28, 2021 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:47:50.326Z: Workers have started successfully.
    Sep 28, 2021 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:48:21.799Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 28, 2021 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:48:21.983Z: Cleaning up.
    Sep 28, 2021 12:48:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:48:22.120Z: Stopping worker pool...
    Sep 28, 2021 12:50:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:50:41.337Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 28, 2021 12:50:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-28T00:50:41.377Z: Worker pool stopped.
    Sep 28, 2021 12:50:48 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-27_17_46_21-4927795710480121675 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 76715fd9-b7c5-4c3e-86cb-1bbaf407eead and timestamp: 2021-09-28T00:50:48.165000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.473

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 28, 2021 12:50:48 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 44.083 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 31s
152 actionable tasks: 99 executed, 53 from cache

Publishing build scan...
https://gradle.com/s/oexuvqidchaq4

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2477

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2477/display/redirect?page=changes>

Changes:

[noreply] Update container tags used by unreleased SDKs.


------------------------------------------
[...truncated 338.48 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 27, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 27, 2021 6:44:58 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 27, 2021 6:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 27, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 27, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 27, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8936137984933033682.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KvPdtXhCZeNI55k6veRk58SDRj4ngDPkdSqPZi2Vx4s.jar
    Sep 27, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 27, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 123b1db6c7af88c71d94372c202a15e7a05d02060ba54f5477693084babb7769> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EjsdtseviMcdlDcsICoV56BdAgYLpU9Ud2kwhLq7d2k.pb
    Sep 27, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 27, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 27, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 27, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-27_11_45_12-10521807972155577567?project=apache-beam-testing
    Sep 27, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-27_11_45_12-10521807972155577567
    Sep 27, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-27_11_45_12-10521807972155577567
    Sep 27, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-27T18:45:15.623Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:45:24.780Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:45:25.535Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:45:25.572Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:45:25.597Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:45:25.671Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:45:25.709Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:45:25.743Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:45:26.200Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:45:26.275Z: Starting 5 workers in us-central1-c...
    Sep 27, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:45:32.104Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2021 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:46:00.948Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 27, 2021 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:46:00.984Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 27, 2021 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:46:11.216Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2021 6:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:46:36.254Z: Workers have started successfully.
    Sep 27, 2021 6:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:46:36.286Z: Workers have started successfully.
    Sep 27, 2021 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:47:16.301Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 27, 2021 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:47:16.445Z: Cleaning up.
    Sep 27, 2021 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:47:16.590Z: Stopping worker pool...
    Sep 27, 2021 6:49:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:49:33.041Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2021 6:49:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T18:49:33.092Z: Worker pool stopped.
    Sep 27, 2021 6:49:42 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-27_11_45_12-10521807972155577567 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dda3b7d8-2d15-47c7-9de5-38cd1c09abee and timestamp: 2021-09-27T18:49:42.444000000Z:
                     Metric:                    Value:
                   read_time                    14.444
                 fields_read                 4375276.0
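
The two metrics above are produced by the monitoring transforms visible in the step names earlier in the log: ParDo(TimeMonitor) tracks when elements pass through the fused read stage, which yields read_time, and ParDo(RowMonitor) tallies fields, which yields fields_read. The following is a hypothetical reconstruction of such a monitor using the Beam metrics API; the namespace, metric names, and class name are invented for illustration, and the test's actual DoFns may differ.

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    // Passes rows through unchanged while recording how many fields were read
    // and when each element was observed.
    static class MonitorFn extends DoFn<Row, Row> {
      private final Counter fieldsRead = Metrics.counter("perf", "fields_read");
      private final Distribution eventTimes = Metrics.distribution("perf", "event_time_ms");

      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        fieldsRead.inc(row.getFieldCount());
        eventTimes.update(System.currentTimeMillis());
        out.output(row);
      }
    }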

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2021 6:49:42 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
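
The warning above means the publisher was never given an InfluxDB measurement or database name, so the metrics shown are printed but not persisted. Below is a minimal sketch of supplying those settings, assuming a builder-style InfluxDBSettings API in Beam's test utilities; the builder method names follow that assumption, and the host, database, and measurement values are placeholders.

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // All values are placeholders; real runs pass them via pipeline options.
    // The builder method names are an assumption about the test-utilities API.
    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();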

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 49.112 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 23s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/7k7nv6g4i2enk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2476

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2476/display/redirect>

Changes:


------------------------------------------
[...truncated 338.34 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 27, 2021 12:44:55 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 27, 2021 12:44:56 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 27, 2021 12:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 27, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
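
This plan is the non-push-down shape: a BeamCalcRel carrying the projection and WHERE clause stacked on a plain BeamIOSourceRel, so filtering happens inside Beam rather than inside BigQuery. For reference, here is a minimal sketch of running the same query through Beam SQL's public API; pipeline wiring is omitted, and rows is a placeholder for a schema-aware PCollection<Row> standing in for the HACKER_NEWS table (see the setRowSchema sketch after the failure below for how that schema gets attached).

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    // 'rows' must already have a schema attached or coder inference fails,
    // which is exactly the IllegalStateException reported below.
    PCollection<Row> filtered =
        PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), rows)
            .apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));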


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
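
The failure is the coder check quoted above: the Row output of ParDo(RowMonitor) reaches pipeline construction without a schema, so no Row coder can be inferred. Both remedies named in the message are shown below as a minimal sketch; the four-field schema is illustrative only (the failing collection actually carries the full 14-column HACKER_NEWS row), and rows is a placeholder for the monitored output.

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;

    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")   // INT64 is an assumption about the column type
            .build();

    // Use one of the two, not both, on the same PCollection:
    rows.setRowSchema(schema);             // preferred: attaches the schema itself
    // rows.setCoder(RowCoder.of(schema)); // alternative: explicit schema-based coder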

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
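
This is the push-down path working as intended: the planner hands both the column projection (usedFields) and the filter to the BigQuery Storage Read API, so only the four columns and the matching rows leave BigQuery, and this is the one test variant in the run that reaches submission rather than failing at construction. A minimal sketch of an equivalent standalone read with BigQueryIO's public API follows; the table reference and the pipeline variable are placeholders.

    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    pipeline.apply(
        BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full")   // placeholder table
            .withMethod(Method.DIRECT_READ)
            // Projection push-down: mirrors usedFields in the plan above.
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // Filter push-down: mirrors the filter logged above.
            // Both options require the DIRECT_READ method.
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
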
    Sep 27, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 27, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1698953046670420008.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MW8u-3tPmTyH5yoyJtew4u-wYz3XrKQaqwvhvgeXgkE.jar
    Sep 27, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 2 seconds
    Sep 27, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 89dd6f105e7b3b8391359e6ae9e4933451c16ae68aa39ed85bc6c82e400c2d37> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-id1vEF57O4ORNZ5q6eSTNFHBauaKo57YW8bILkAMLTc.pb
    Sep 27, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 27, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 27, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 27, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-27_05_45_11-16125404393212011150?project=apache-beam-testing
    Sep 27, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-27_05_45_11-16125404393212011150
    Sep 27, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-27_05_45_11-16125404393212011150
    Sep 27, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-27T12:45:14.510Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
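
The warning is expected given the flags in the executor command line above: with --autoscalingAlgorithm=NONE the worker pool is pinned at --numWorkers, and --maxNumWorkers has no effect. Below is a minimal sketch of setting the same worker options programmatically, using the values from this run and the standard Dataflow runner options interfaces.

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    DataflowPipelineOptions options =
        PipelineOptionsFactory.create().as(DataflowPipelineOptions.class);
    options.setRunner(DataflowRunner.class);
    options.setRegion("us-central1");
    options.setNumWorkers(5);        // pool stays at exactly 5 workers
    options.setMaxNumWorkers(5);     // ignored once autoscaling is disabled
    options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
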
    Sep 27, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:21.432Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:22.218Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:22.269Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:22.303Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:22.364Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:22.395Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:22.423Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 27, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:22.810Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 27, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:22.876Z: Starting 5 workers in us-central1-c...
    Sep 27, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:34.089Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:59.410Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 27, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:45:59.455Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 27, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:46:09.735Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:46:32.976Z: Workers have started successfully.
    Sep 27, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:46:33.009Z: Workers have started successfully.
    Sep 27, 2021 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:47:03.538Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 27, 2021 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:47:03.697Z: Cleaning up.
    Sep 27, 2021 12:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:47:03.777Z: Stopping worker pool...
    Sep 27, 2021 12:49:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:49:27.387Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2021 12:49:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T12:49:27.436Z: Worker pool stopped.
    Sep 27, 2021 12:49:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-27_05_45_11-16125404393212011150 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 867e641a-db97-4a36-a5de-ea3285f71f93 and timestamp: 2021-09-27T12:49:34.956000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     10.04

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2021 12:49:35 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 43.378 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 17s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/p7bm6iey2pxim

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2475

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2475/display/redirect>

Changes:


------------------------------------------
[...truncated 339.41 KB...]
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 27, 2021 6:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 27, 2021 6:44:59 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 27, 2021 6:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 27, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 6:45:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1299075293]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2021 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 27, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 27, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test606067313956805319.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XVMyk8-627vyBINKnF0d9O_TbZjHcRUaLr5SgEpkflE.jar
    Sep 27, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 1 seconds
    Sep 27, 2021 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash a9bb741af712aaca2d27e2bff84804dd533a5921550fec01bb003874fa05bf5d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-qbt0GvcSqsotJ-K_-EgE3VM6WSFVD-wBuwA4dPoFv10.pb
    Sep 27, 2021 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2021 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 27, 2021 6:45:18 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 27, 2021 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 27, 2021 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-26_23_45_19-9658409184207848229?project=apache-beam-testing
    Sep 27, 2021 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-26_23_45_19-9658409184207848229
    Sep 27, 2021 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-26_23_45_19-9658409184207848229
    Sep 27, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-27T06:45:22.389Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:45:34.172Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 27, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:45:35.036Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:45:35.075Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:45:35.102Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:45:35.174Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:45:35.200Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:45:35.233Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 27, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:45:35.529Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 27, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:45:35.586Z: Starting 5 workers in us-central1-c...
    Sep 27, 2021 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:45:43.170Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2021 6:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:46:21.095Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2021 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:46:47.019Z: Workers have started successfully.
    Sep 27, 2021 6:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:46:47.055Z: Workers have started successfully.
    Sep 27, 2021 6:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:47:17.274Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 27, 2021 6:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:47:17.408Z: Cleaning up.
    Sep 27, 2021 6:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:47:17.484Z: Stopping worker pool...
    Sep 27, 2021 6:49:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:49:39.507Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2021 6:49:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T06:49:39.537Z: Worker pool stopped.
    Sep 27, 2021 6:49:45 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-26_23_45_19-9658409184207848229 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3abfee88-11a2-484c-ba29-cddc2e21f9e2 and timestamp: 2021-09-27T06:49:45.464000000Z:
                     Metric:                    Value:
                   read_time                      8.58
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2021 6:49:46 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 54.179 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 28s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/cqexpwyazocci

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2474

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2474/display/redirect>

Changes:


------------------------------------------
[...truncated 337.45 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 27, 2021 12:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 27, 2021 12:44:57 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 27, 2021 12:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 27, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
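
The IllegalStateException above is Beam's standard coder-inference failure for a PCollection of Row: Row has no default Coder, so the producing transform has to carry a schema. Below is a minimal, self-contained sketch of the remedy the message itself points at (PCollection.setRowSchema). It is an illustration only: RowSchemaFix is a hypothetical class, and the field names/types are assumptions modeled on the query's projected columns (author, type, title, score), not the actual HACKER_NEWS table schema.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      // Hypothetical schema mirroring the query's projected columns; the real
      // HACKER_NEWS column types may differ.
      private static final Schema SCHEMA = Schema.builder()
          .addNullableField("author", Schema.FieldType.STRING)
          .addNullableField("type", Schema.FieldType.STRING)
          .addNullableField("title", Schema.FieldType.STRING)
          .addNullableField("score", Schema.FieldType.INT32)
          .build();

      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();
        PCollection<Row> rows = pipeline
            .apply(Create.of("seed"))
            .apply(ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void process(ProcessContext c) {
                c.output(Row.withSchema(SCHEMA)
                    .addValues("alice", "story", "example title", 3).build());
              }
            }))
            // The fix: attach the schema so a RowCoder can be derived. Without
            // this call, coder inference fails with the exception quoted above.
            .setRowSchema(SCHEMA);
        pipeline.run().waitUntilFinish();
      }
    }

The same root cause applies to the readUsingDefaultMethod failure below; only the push-down variant avoids it.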

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 27, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 27, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 27, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
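
For context, the pushed-down read logged here is roughly equivalent to a hand-written BigQueryIO direct read that selects only the used fields and ships the filter to the BigQuery Storage API. The sketch below illustrates that equivalence under stated assumptions; it is not the code this test runs, and the table reference is a placeholder.

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();
        // Only the projected columns are read, and the predicate is evaluated
        // server side, which is what the BeamPushDownIOSourceRel plan expresses.
        PCollection<TableRow> rows = pipeline.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // placeholder table
                .withMethod(TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        pipeline.run().waitUntilFinish();
      }
    }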
    Sep 27, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 27, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 27, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 27, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5895972536111496274.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-61IUw3jGh0-VUqGy5GTi1tycqW1iq3FKeI23tM-4hkY.jar
    Sep 27, 2021 12:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 file newly uploaded in 0 seconds
    Sep 27, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 27, 2021 12:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash f0711782d78e76ed34f7fa55c40211b2985f0b8145a086e1f24abe7d8dfe5834> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-8HEXgteOdu009_pVxAIRsphfC4FFoIbh8kq-fY3-WDQ.pb
    Sep 27, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 27, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 27, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 27, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 27, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-26_17_45_10-9883298268261907232?project=apache-beam-testing
    Sep 27, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-26_17_45_10-9883298268261907232
    Sep 27, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-26_17_45_10-9883298268261907232
    Sep 27, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-27T00:45:14.085Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 27, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:45:20.279Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 27, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:45:20.996Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 27, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:45:21.058Z: Expanding GroupByKey operations into optimizable parts.
    Sep 27, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:45:21.088Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 27, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:45:21.162Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 27, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:45:21.188Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 27, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:45:21.215Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 27, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:45:21.614Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 27, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:45:21.693Z: Starting 5 workers in us-central1-c...
    Sep 27, 2021 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:45:51.156Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 27, 2021 12:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:46:01.729Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 27, 2021 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:46:30.803Z: Workers have started successfully.
    Sep 27, 2021 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:46:30.823Z: Workers have started successfully.
    Sep 27, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:47:00.290Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 27, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:47:00.426Z: Cleaning up.
    Sep 27, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:47:00.526Z: Stopping worker pool...
    Sep 27, 2021 12:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:49:13.308Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 27, 2021 12:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-27T00:49:13.346Z: Worker pool stopped.
    Sep 27, 2021 12:49:19 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-26_17_45_10-9883298268261907232 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 78774a8c-e473-4256-a112-56fad869f6c0 and timestamp: 2021-09-27T00:49:19.665000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.946

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 27, 2021 12:49:20 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
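
The warning above only means no InfluxDB target was configured for this run, so the measured values stay in the console output. Assuming Beam's usual load-test options (an assumption, not confirmed by this log), publishing would be enabled by adding options along the lines of --influxDatabase=<database>, --influxMeasurement=<measurement>, and --influxHost=<host> to the test pipeline options.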

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 28.152 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/yt3emrpzmp5cc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2473

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2473/display/redirect>

Changes:


------------------------------------------
[...truncated 339.99 KB...]

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 26, 2021 6:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 26, 2021 6:45:00 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 26, 2021 6:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 26, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 26, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 26, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5132010080650681199.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-cqTi9NX-l9vIjrtS_yl5wS3CrksSOoMGbycT8bqs4l4.jar
    Sep 26, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.thrift/libthrift/0.14.1/85348a0c44c298bbec5ae747e67ae12e60b3aef6/libthrift-0.14.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/libthrift-0.14.1-WzUQ_nLm8HJeKc7269seq6zMxp15_E7VC2gWAKh2Z-w.jar
    Sep 26, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.tomcat/tomcat-annotations-api/8.5.46/56c67699de192c603afd6f029e80e5ff8d98e7e9/tomcat-annotations-api-8.5.46.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tomcat-annotations-api-8.5.46-amtG0OaVhkRRTAyjZYs7B-YSOmgqIO4203lSQnNfq8M.jar
    Sep 26, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.tomcat.embed/tomcat-embed-core/8.5.46/5d686394334d143f48251827435ab086a161e75e/tomcat-embed-core-8.5.46.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tomcat-embed-core-8.5.46-vl-FREjS7l1uADb-srT3ExYweaG2uaepdQjlWRetNcI.jar
    Sep 26, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 4 files newly uploaded in 0 seconds
    Sep 26, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash 5e3dbc930ebc167e27f19d0d5ba381333c2fb8916345d839a2d556ca42f72583> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Xj28kw68Fn4n8Z0NW6OBMzwvuJFjRdg5otVWykL3JYM.pb
    Sep 26, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 26, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 26, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 26, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-26_11_45_14-11136377410489573887?project=apache-beam-testing
    Sep 26, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-26_11_45_14-11136377410489573887
    Sep 26, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-26_11_45_14-11136377410489573887
    Sep 26, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-26T18:45:17.364Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:45:22.292Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 26, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:45:22.874Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:45:22.905Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:45:22.933Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:45:23.005Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:45:23.030Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:45:23.052Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 26, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:45:23.332Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 26, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:45:23.396Z: Starting 5 workers in us-central1-a...
    Sep 26, 2021 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:45:31.122Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2021 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:46:07.181Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2021 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:46:33.881Z: Workers have started successfully.
    Sep 26, 2021 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:46:33.914Z: Workers have started successfully.
    Sep 26, 2021 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:47:03.208Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 26, 2021 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:47:03.355Z: Cleaning up.
    Sep 26, 2021 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:47:03.422Z: Stopping worker pool...
    Sep 26, 2021 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:49:22.658Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2021 6:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T18:49:22.703Z: Worker pool stopped.
    Sep 26, 2021 6:49:28 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-26_11_45_14-11136377410489573887 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8eea85c1-1a26-469a-a8ac-1bddc55bf2a5 and timestamp: 2021-09-26T18:49:28.285000000Z:
                     Metric:                    Value:
                   read_time                     8.894
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2021 6:49:28 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 33.013 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/b44yzy35q7g6o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2472

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2472/display/redirect>

Changes:


------------------------------------------
[...truncated 337.98 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 26, 2021 12:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 26, 2021 12:44:58 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 26, 2021 12:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 26, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 26, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 26, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 26, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1254604070143763192.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-9-kMvil9kuVjtdOuQKHYscGOFYbYN5eesmOu_omgvz8.jar
    Sep 26, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 file newly uploaded in 1 second
    Sep 26, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash af62ebf8274598151ff04eec2336c5b03bf73f2e28da7d1c241753ab8fef334b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-r2Lr-CdFmBUf8E7sIzbFsDv3Py4o2n0cJBdTq4_vM0s.pb
    Sep 26, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 26, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 26, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 26, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-26_05_45_14-9479678245125380563?project=apache-beam-testing
    Sep 26, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-26_05_45_14-9479678245125380563
    Sep 26, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-26_05_45_14-9479678245125380563
    Sep 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-26T12:45:17.295Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:24.383Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 26, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:25.143Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:25.193Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:25.211Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:25.271Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:25.291Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:25.322Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 26, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:25.669Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 26, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:25.745Z: Starting 5 workers in us-central1-c...
    Sep 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:49.475Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:59.787Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 26, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:45:59.809Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 26, 2021 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:46:10.027Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:46:36.856Z: Workers have started successfully.
    Sep 26, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:46:36.898Z: Workers have started successfully.
    Sep 26, 2021 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:47:06.020Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 26, 2021 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:47:06.176Z: Cleaning up.
    Sep 26, 2021 12:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:47:06.257Z: Stopping worker pool...
    Sep 26, 2021 12:49:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:49:26.413Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2021 12:49:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T12:49:26.477Z: Worker pool stopped.
    Sep 26, 2021 12:49:31 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-26_05_45_14-9479678245125380563 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a8b4c07a-fbda-4b8b-ac9f-7ec72f2fd5a1 and timestamp: 2021-09-26T12:49:31.921000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     6.881

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2021 12:49:33 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 40.88 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 15s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/misvg7v45ysku

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2471

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2471/display/redirect>

Changes:


------------------------------------------
[...truncated 338.44 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar>","--region=us-central1"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
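
The -DbeamTestPipelineOptions JSON array in the command line above is how the Gradle harness hands pipeline options to the test JVM; Beam's TestPipeline parses that system property at test setup. A minimal sketch of reading the injected options, assuming the Beam testing and GCP extension modules are on the classpath:

    import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.testing.TestPipeline;

    public class OptionsSmoke {
      public static void main(String[] args) {
        // Parses the -DbeamTestPipelineOptions JSON array into PipelineOptions.
        PipelineOptions options = TestPipeline.testingPipelineOptions();
        System.out.println("runner=" + options.getRunner().getSimpleName());
        System.out.println("project=" + options.as(GcpOptions.class).getProject());
      }
    }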

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 26, 2021 6:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 26, 2021 6:44:57 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 26, 2021 6:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 26, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
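
The trace above is the recurring failure in these builds: ParDo(RowMonitor) emits Beam Rows, and a coder for Row cannot be inferred unless a schema is attached, exactly as the second root cause in the message suggests. A minimal sketch of that remedy, with column names taken from the query above and the types/nullability as assumptions:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        // Matches the SELECTed columns; field types and nullability are assumed.
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        // setRowSchema installs the schema and the matching RowCoder, so
        // coder inference no longer fails when the pipeline is finalized.
        return rows.setRowSchema(schema);
      }
    }

Equivalently, rows.setCoder(RowCoder.of(schema)) would satisfy the first suggestion in the message.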

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 26, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 26, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1235604829244032045.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XqowqaS49m2HdUlnmpXrWGy6UWTNhVhLxBOJFdOJcpw.jar
    Sep 26, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 26, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 045abca4714f0fae801463ce12fdfc75f5a037f7c51a41734820e0019a4ec701> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BFq8pHFPD66AFGPOEv38dfWgN_fFGkFzSCDgAZpOxwE.pb
    Sep 26, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 26, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 26, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 26, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-25_23_45_10-12170458182251827948?project=apache-beam-testing
    Sep 26, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-25_23_45_10-12170458182251827948
    Sep 26, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-25_23_45_10-12170458182251827948
    Sep 26, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-26T06:45:14.041Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:20.353Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 26, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:21.074Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:21.113Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:21.150Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:21.221Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:21.248Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:21.285Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 26, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:21.628Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 26, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:21.718Z: Starting 5 workers in us-central1-c...
    Sep 26, 2021 6:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:47.609Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2021 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:55.286Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 26, 2021 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:45:55.319Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 26, 2021 6:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:46:05.542Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2021 6:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:46:30.039Z: Workers have started successfully.
    Sep 26, 2021 6:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:46:30.079Z: Workers have started successfully.
    Sep 26, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:46:59.101Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 26, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:46:59.263Z: Cleaning up.
    Sep 26, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:46:59.367Z: Stopping worker pool...
    Sep 26, 2021 6:49:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:49:14.151Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2021 6:49:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T06:49:14.200Z: Worker pool stopped.
    Sep 26, 2021 6:49:21 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-25_23_45_10-12170458182251827948 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4558b16c-a64e-48ab-88ca-dceffb0d9a04 and timestamp: 2021-09-26T06:49:21.114000000Z:
                     Metric:                    Value:
                   read_time                     9.072
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2021 6:49:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 28.379 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/mf35lmvuobpb6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2470

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2470/display/redirect>

Changes:


------------------------------------------
[...truncated 337.75 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar>","--region=us-central1"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 26, 2021 12:44:52 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 26, 2021 12:44:53 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 26, 2021 12:44:54 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 26, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 26, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 26, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 26, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 26, 2021 12:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 26, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 26, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 26, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1834057475545864917.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-yWIMTLhmjPJLlhf9R_2BhWwV8Bx-Bjc22jrTxmwJgo8.jar
    Sep 26, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 26, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 26, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash 844be46f876ef98404bf3f0c87ed56fedf368f27be27326fa318dfc790479a0f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hEvkb4du-YQEvz8Mh-1W_t82jye-JzJvoxjfx5BHmg8.pb
    Sep 26, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 26, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 26, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 26, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 26, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-25_17_45_06-6421286648442750099?project=apache-beam-testing
    Sep 26, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-25_17_45_06-6421286648442750099
    Sep 26, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-25_17_45_06-6421286648442750099
    Sep 26, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-26T00:45:09.573Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 26, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:45:16.083Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 26, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:45:16.818Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 26, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:45:16.847Z: Expanding GroupByKey operations into optimizable parts.
    Sep 26, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:45:16.878Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 26, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:45:16.948Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 26, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:45:16.975Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 26, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:45:17.008Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 26, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:45:17.534Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 26, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:45:17.607Z: Starting 5 workers in us-central1-a...
    Sep 26, 2021 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:45:26.661Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 26, 2021 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:46:03.561Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 26, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:46:30.032Z: Workers have started successfully.
    Sep 26, 2021 12:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:46:30.088Z: Workers have started successfully.
    Sep 26, 2021 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:46:58.907Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 26, 2021 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:46:59.050Z: Cleaning up.
    Sep 26, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:46:59.152Z: Stopping worker pool...
    Sep 26, 2021 12:49:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:49:20.479Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 26, 2021 12:49:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-26T00:49:20.526Z: Worker pool stopped.
    Sep 26, 2021 12:49:26 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-25_17_45_06-6421286648442750099 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9d1ce8f9-c69b-49ac-8fd8-008a50fad271 and timestamp: 2021-09-26T00:49:26.375000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.087

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 26, 2021 12:49:26 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 37.561 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 9s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/72olwmfyztj2q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2469

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2469/display/redirect>

Changes:


------------------------------------------
[...truncated 337.75 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 25, 2021 6:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 25, 2021 6:44:55 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 25, 2021 6:44:55 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 25, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
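
The exception above also names its own fix: a Coder cannot be inferred for a PCollection of Row values, so a schema has to be attached explicitly. A minimal sketch of that fix, assuming a PCollection<Row> carrying the four projected columns (field names and nullability here are assumptions based on the query):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.FieldType;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Attaching a schema lets Beam derive a RowCoder for the collection.
    static PCollection<Row> attachSchema(PCollection<Row> rows) {
      Schema schema =
          Schema.builder()
              .addNullableField("author", FieldType.STRING)
              .addNullableField("type", FieldType.STRING)
              .addNullableField("title", FieldType.STRING)
              .addNullableField("score", FieldType.INT64)
              .build();
      // Equivalent alternative: rows.setCoder(RowCoder.of(schema))
      return rows.setRowSchema(schema);
    }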

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
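
Aside from the shared coder failure, the only difference between these tests is the read method logged above: DEFAULT reads through a BigQuery export job, while DIRECT_READ uses the BigQuery Storage Read API. Outside Beam SQL, the same choice is made directly on BigQueryIO; a sketch, with the source table name assumed for illustration:

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    static PCollection<TableRow> readHackerNews(Pipeline p) {
      return p.apply(
          BigQueryIO.readTableRows()
              .from("bigquery-public-data:hacker_news.full") // table name assumed
              .withMethod(Method.DIRECT_READ));              // or Method.DEFAULT
    }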

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
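
The BeamPushDownIOSourceRel plan above shows the planner folding both the projection (usedFields) and the filter into the BigQuery read itself, so only the matching columns and rows leave BigQuery. The IT drives the planner through internal utilities (BeamSqlRelUtils, per the stack traces above), but the same query can be written against the public SqlTransform API; a sketch, assuming the beam.HACKER_NEWS table has already been registered with a BigQuery table provider:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Push-down happens in the planner; the caller just submits the SQL.
    static PCollection<Row> queryHackerNews(Pipeline p) {
      return p.apply(
          SqlTransform.query(
              "SELECT `by` AS `author`, `type`, `title`, `score` "
                  + "FROM `beam`.`HACKER_NEWS` "
                  + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));
    }
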
    Sep 25, 2021 6:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 25, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8238155918827975761.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TRur8FMsi0thdlBAFaqG7pqS8oYSIwXO8ZKbm25MraY.jar
    Sep 25, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 25, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash ed7c385e0be0647a842284b68b592f1461157ec46f401fae8892fc0e12473cda> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7Xw4XgvgZHqEIoS2i1kvFGEVfsRvQB-uiJL8DhJHPNo.pb
    Sep 25, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 25, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 25, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 25, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-25_11_45_08-4975878268342362171?project=apache-beam-testing
    Sep 25, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-25_11_45_08-4975878268342362171
    Sep 25, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-25_11_45_08-4975878268342362171
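
The gcloud command above is the out-of-band way to stop the job; the submitting process can do the same through the PipelineResult handle returned by Pipeline.run(). A small sketch:

    import java.io.IOException;
    import org.apache.beam.sdk.PipelineResult;

    // Cancels a still-running job; otherwise waits for the terminal state.
    static void cancelOrWait(PipelineResult result) throws IOException {
      if (result.getState() == PipelineResult.State.RUNNING) {
        result.cancel();
      } else {
        result.waitUntilFinish();
      }
    }
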
    Sep 25, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-25T18:45:12.177Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:45:17.856Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 25, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:45:18.532Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:45:18.602Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:45:18.631Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:45:18.695Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:45:18.718Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:45:18.742Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 25, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:45:19.039Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 25, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:45:19.120Z: Starting 5 workers in us-central1-a...
    Sep 25, 2021 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:45:40.300Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2021 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:46:08.797Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:46:33.495Z: Workers have started successfully.
    Sep 25, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:46:33.521Z: Workers have started successfully.
    Sep 25, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:47:06.030Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 25, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:47:06.176Z: Cleaning up.
    Sep 25, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:47:06.261Z: Stopping worker pool...
    Sep 25, 2021 6:49:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:49:31.848Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2021 6:49:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T18:49:31.886Z: Worker pool stopped.
    Sep 25, 2021 6:49:38 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-25_11_45_08-4975878268342362171 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 15db7ab9-ee8a-4504-a8a5-d08aa9191a95 and timestamp: 2021-09-25T18:49:38.775000000Z:
                     Metric:                    Value:
                   read_time                     8.479
                 fields_read                 4375276.0
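
Result tables like this one are produced by querying the job's metrics from the PipelineResult once the run reaches a terminal state. A minimal sketch using the core MetricResults API (the "perf" namespace is an assumption; the test utilities may wrap this differently):

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    static long totalFieldsRead(PipelineResult result) {
      MetricQueryResults metrics =
          result.metrics().queryMetrics(
              MetricsFilter.builder()
                  .addNameFilter(MetricNameFilter.named("perf", "fields_read"))
                  .build());
      long total = 0;
      for (MetricResult<Long> counter : metrics.getCounters()) {
        // Attempted values are the most portable across runners.
        total += counter.getAttempted();
      }
      return total;
    }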

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2021 6:49:39 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 48.54 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 20s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/enghqxrp46tki

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2468

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2468/display/redirect>

Changes:


------------------------------------------
[...truncated 343.89 KB...]
> Task :sdks:java:extensions:sql:perf-tests:testClasses
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 3 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 3'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 3'
Successfully started process 'Gradle Test Executor 3'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 25, 2021 12:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 25, 2021 12:45:20 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 25, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 25, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 12:45:27 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2021 12:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 25, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 25, 2021 12:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8001638972657287600.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-UvFX61zunvX7BuizZ3b4gxlqm0h6oZuwbc9Zdwvd_wQ.jar
    Sep 25, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 1 seconds
    Sep 25, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash 4c7c3cdddebf371f1d2b96d022fcabda988b617a94333afe45fdf50d96192006> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-THw83d6_Nx8dK5bQIvyr2piLYXqUMzr-Rf31DZYZIAY.pb
    Sep 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 25, 2021 12:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 25, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-25_05_45_36-15522118172883620923?project=apache-beam-testing
    Sep 25, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-25_05_45_36-15522118172883620923
    Sep 25, 2021 12:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-25_05_45_36-15522118172883620923
    Sep 25, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-25T12:45:39.660Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:45:44.766Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 25, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:45:45.482Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:45:45.545Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:45:45.573Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:45:45.637Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:45:45.662Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:45:45.693Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 25, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:45:46.020Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 25, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:45:46.085Z: Starting 5 workers in us-central1-a...
    Sep 25, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:45:58.575Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:46:31.503Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:46:56.523Z: Workers have started successfully.
    Sep 25, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:46:56.542Z: Workers have started successfully.
    Sep 25, 2021 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:47:24.373Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 25, 2021 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:47:24.492Z: Cleaning up.
    Sep 25, 2021 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:47:24.551Z: Stopping worker pool...
    Sep 25, 2021 12:49:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:49:45.782Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2021 12:49:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T12:49:45.825Z: Worker pool stopped.
    Sep 25, 2021 12:49:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-25_05_45_36-15522118172883620923 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4944706f-133e-4e6a-b59a-da64c29a8332 and timestamp: 2021-09-25T12:49:50.829000000Z:
                     Metric:                    Value:
                   read_time                      8.49
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2021 12:49:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 35.131 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 31s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/4euke2lgttdmw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2467

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2467/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11982] Java Spanner - Implement IO Request Count metrics (#15493)


------------------------------------------
[...truncated 352.65 KB...]
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 83e0aceefc3fd4863bf7ab55c5815815
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 4'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 4'
Successfully started process 'Gradle Test Executor 4'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 25, 2021 6:46:25 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 25, 2021 6:46:25 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 25, 2021 6:46:26 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 25, 2021 6:46:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 6:46:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:46:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 6:46:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 6:46:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:46:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:46:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 6:46:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 6:46:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 6:46:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 6:46:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2021 6:46:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
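
    Note the contrast with the two failing tests above: with push-down the planner emits a BeamPushDownIOSourceRel that narrows the read to the four used fields and hands the whole predicate to BigQuery. A hand-written BigQueryIO read requesting the same projection and filter from the Storage Read API would look roughly like this (the table name is a placeholder, not taken from this log):

        import java.util.Arrays;
        import com.google.api.services.bigquery.model.TableRow;
        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
        import org.apache.beam.sdk.values.PCollection;

        Pipeline p = Pipeline.create();
        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")    // assumed table
                .withMethod(TypedRead.Method.DIRECT_READ)         // Storage Read API
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));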
    Sep 25, 2021 6:46:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2021 6:46:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2021 6:46:37 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-xiAgUZ3f8CgUmkhSZWRg_5Fz0oYmRVjZGHVW_XRaCdY.jar
    Sep 25, 2021 6:46:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2940449598606000992.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-v42ycm5qXd3fLwhQVOLmUpUDz8Xi7obUeUNvBkWeEqU.jar
    Sep 25, 2021 6:46:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 25, 2021 6:46:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2021 6:46:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103999 bytes, hash bfd1edf3246b3d51bbbe1f571961bc8b8ebff57fd77c7c51f34e16250e8a6cac> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-v9Ht8yRrPVG7vh9XGWG8i46_9X_XfHxR804WJQ6KbKw.pb
    Sep 25, 2021 6:46:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2021 6:46:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 25, 2021 6:46:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 25, 2021 6:46:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 25, 2021 6:46:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-24_23_46_40-77168588993866869?project=apache-beam-testing
    Sep 25, 2021 6:46:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-24_23_46_40-77168588993866869
    Sep 25, 2021 6:46:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-24_23_46_40-77168588993866869
    Sep 25, 2021 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-25T06:46:44.084Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2021 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:46:49.438Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 25, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:46:50.180Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:46:50.215Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:46:50.244Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:46:50.299Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:46:50.322Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:46:50.352Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 25, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:46:50.653Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 25, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:46:50.718Z: Starting 5 workers in us-central1-a...
    Sep 25, 2021 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:46:56.641Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2021 6:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:47:25.066Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 25, 2021 6:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:47:25.096Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 25, 2021 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:47:35.472Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 25, 2021 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:47:35.496Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 25, 2021 6:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:47:45.830Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2021 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:48:01.677Z: Workers have started successfully.
    Sep 25, 2021 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:48:01.702Z: Workers have started successfully.
    Sep 25, 2021 6:48:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:48:34.761Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 25, 2021 6:48:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:48:34.903Z: Cleaning up.
    Sep 25, 2021 6:48:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:48:34.975Z: Stopping worker pool...
    Sep 25, 2021 6:50:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:50:53.573Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2021 6:50:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T06:50:53.617Z: Worker pool stopped.
    Sep 25, 2021 6:50:58 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-24_23_46_40-77168588993866869 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d7b997e1-445c-410c-972d-ea7683bbeed8 and timestamp: 2021-09-25T06:50:58.986000000Z:
                     Metric:                    Value:
                   read_time                    13.598
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2021 6:50:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
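
    The missing measurement/database only affects publishing: the results above were computed but not written to InfluxDB. For illustration, Beam's test utilities build the publisher's configuration through InfluxDBSettings, roughly as below; the host and database values are assumptions, not read from this job's configuration:

        import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

        // Sketch only: supply the two properties the warning says are missing.
        InfluxDBSettings settings = InfluxDBSettings.builder()
            .withHost("http://localhost:8086")             // assumed host
            .withDatabase("beam_test_metrics")             // assumed database
            .withMeasurement("sql_bqio_read_java_batch")   // assumed measurement
            .get();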

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.21 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 38.103 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 37s
152 actionable tasks: 104 executed, 48 from cache

Publishing build scan...
https://gradle.com/s/ott5u3qmewgqc

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2466

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2466/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-12934] Use environment capabilities to determine length prefixing.

[Brian Hulette] Add missing comma

[ryanthompson591] Update tensorflow to version 2.6.0

[zhoufek] [BEAM-9487] Fix incorrect Repeatedly.may_finish implementation

[noreply] Cleanup use of futures. (#15043)

[noreply] Go Lint fix for wordcount and metrics (#15580)

[Brian Hulette] Fix whitespace lint

[noreply] [BEAM-3304] Snippets for trigger in BPG (#15409)

[noreply] [BEAM-12832] Add Go SDK xlang info to programming guide. (#15447)

[noreply] [BEAM-11097] Add SideInputCache to StateReader (#15563)

[Robert Bradshaw] Revert "Avoid apiary submission of job graph when it is not needed.

[noreply] [BEAM-12769] Java-emulating external transform. (#15546)

[noreply] [BEAM-12913] Pass query priority from ReadAllFromBigQuery (#15584)


------------------------------------------
[...truncated 338.28 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) started.
Gradle Test Executor 8 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is c75df4ad0b4326aabbad29be2039d846
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 8'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 8'
Successfully started process 'Gradle Test Executor 8'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 25, 2021 12:44:38 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
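
    The fix the warning asks for is a one-line change to the test's pipeline options; schematically (the image value is a placeholder):

        --workerHarnessContainerImage=IMAGE   ->   --sdkContainerImage=IMAGE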
    Sep 25, 2021 12:44:39 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 25, 2021 12:44:40 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 25, 2021 12:44:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 12:44:42 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:44:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 12:44:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 12:44:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:44:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 12:44:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 25, 2021 12:44:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 25, 2021 12:44:50 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 25, 2021 12:44:50 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-iWELRk0J0eqj9_ooQn6KLmU0Q20eazTA8el4stWEfXY.jar
    Sep 25, 2021 12:44:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7309957586490269237.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_WnyG8sLTgAp6Br9WwdaQ1vDWck5goL-RRfSg374y_Y.jar
    Sep 25, 2021 12:44:57 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 7 seconds
    Sep 25, 2021 12:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 25, 2021 12:44:57 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash e856d800349e70babc4d947a539bc6e2d7bd2517520d0092fdd59d3481f2ec48> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6FbYADSecLq8TZR6U5vG4te9JRdSDQCS_dWdNIHy7Eg.pb
    Sep 25, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 25, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 25, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 25, 2021 12:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 25, 2021 12:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-24_17_45_00-410929181615742579?project=apache-beam-testing
    Sep 25, 2021 12:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-24_17_45_00-410929181615742579
    Sep 25, 2021 12:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-24_17_45_00-410929181615742579
    Sep 25, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-25T00:45:03.579Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 25, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:45:34.207Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 25, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:45:35.577Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 25, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:45:36.306Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 25, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:45:36.355Z: Expanding GroupByKey operations into optimizable parts.
    Sep 25, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:45:36.390Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 25, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:45:36.488Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 25, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:45:36.514Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 25, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:45:36.550Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 25, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:45:36.936Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 25, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:45:37.010Z: Starting 5 workers in us-central1-c...
    Sep 25, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:46:19.593Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 25, 2021 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:46:45.910Z: Workers have started successfully.
    Sep 25, 2021 12:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:46:45.945Z: Workers have started successfully.
    Sep 25, 2021 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:47:18.633Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 25, 2021 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:47:18.793Z: Cleaning up.
    Sep 25, 2021 12:47:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:47:18.856Z: Stopping worker pool...
    Sep 25, 2021 12:49:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:49:29.735Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 25, 2021 12:49:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-25T00:49:29.777Z: Worker pool stopped.
    Sep 25, 2021 12:49:37 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-24_17_45_00-410929181615742579 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): cbe5e528-9c14-41a3-8dc1-48566506957a and timestamp: 2021-09-25T00:49:37.265000000Z:
                     Metric:                    Value:
                   read_time                     9.996
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 25, 2021 12:49:37 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 8 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.001 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 5 mins 2.928 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 20s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/d24zdbrmadpau

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2465

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2465/display/redirect>

Changes:


------------------------------------------
[...truncated 338.62 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is c75df4ad0b4326aabbad29be2039d846
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 24, 2021 6:44:42 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 24, 2021 6:44:42 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 24, 2021 6:44:43 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 24, 2021 6:44:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 6:44:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:44:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 6:44:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 6:44:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:44:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 6:44:47 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 6:44:49 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2021 6:44:49 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 24, 2021 6:44:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2021 6:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2021 6:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-iWELRk0J0eqj9_ooQn6KLmU0Q20eazTA8el4stWEfXY.jar
    Sep 24, 2021 6:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1284230225333265737.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-1o2h1Ljn-MnvLwScjFCA2jm-rVn_-0DVX_ewaRdj_-w.jar
    Sep 24, 2021 6:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 24, 2021 6:44:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2021 6:44:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 7eb4f8696f3cb768b6dfb6962d89952410950cb4206f3c7ed25e1d11dd0417f2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-frT4aW88t2i237aWLYmVJBCVDLQgbzx-0l4dEd0EF_I.pb
    Sep 24, 2021 6:44:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2021 6:44:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 24, 2021 6:44:55 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 24, 2021 6:44:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 24, 2021 6:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-24_11_44_55-2459813221828693126?project=apache-beam-testing
    Sep 24, 2021 6:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-24_11_44_55-2459813221828693126
    Sep 24, 2021 6:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-24_11_44_55-2459813221828693126
    Sep 24, 2021 6:44:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-24T18:44:58.962Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:06.108Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 24, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:06.973Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:07.017Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:07.060Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:07.129Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:07.165Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:07.196Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 24, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:07.613Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 24, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:07.711Z: Starting 5 workers in us-central1-c...
    Sep 24, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:18.286Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2021 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:42.476Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 24, 2021 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:42.517Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 24, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:45:52.722Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2021 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:46:20.972Z: Workers have started successfully.
    Sep 24, 2021 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:46:21.013Z: Workers have started successfully.
    Sep 24, 2021 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:46:55.603Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 24, 2021 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:46:55.788Z: Cleaning up.
    Sep 24, 2021 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:46:55.867Z: Stopping worker pool...
    Sep 24, 2021 6:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:49:16.481Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2021 6:49:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T18:49:16.544Z: Worker pool stopped.
    Sep 24, 2021 6:49:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-24_11_44_55-2459813221828693126 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 28209f62-5c8d-4897-a31f-500057983f68 and timestamp: 2021-09-24T18:49:27.331000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.461

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2021 6:49:27 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
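
The "Missing property -- measurement/database" warning means this run carried no InfluxDB publishing settings, so the collected metrics stay local. A rough sketch of wiring those settings via the test-utils builder (the method names, host, and values below are assumptions, not verified against this Beam version):

    // Hedged sketch; the InfluxDBSettings builder methods are assumed.
    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")           // assumed host
            .withDatabase("beam_performance")            // the missing "database"
            .withMeasurement("sql_bqio_read_java_batch") // the missing "measurement"
            .get();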

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.008 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 49.522 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 8s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/jomwvy3bl3cyi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2464

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2464/display/redirect>

Changes:


------------------------------------------
[...truncated 337.24 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is c75df4ad0b4326aabbad29be2039d846
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 24, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 24, 2021 12:44:54 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 24, 2021 12:44:55 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 24, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
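
The exception message above names the missing piece itself: the ParDo(RowMonitor) step outputs Beam Rows, and a PCollection<Row> needs an explicit schema before a RowCoder can be derived. A minimal toy-pipeline sketch of what PCollection.setRowSchema does (not the integration test's actual code; the field names are taken from the query above):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Hypothetical schema mirroring the fields the query reads.
      private static final Schema SCHEMA =
          Schema.builder()
              .addStringField("by")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String s, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(SCHEMA)
                                    .addValues("someone", "story", "a title", 3L)
                                    .build());
                          }
                        }));
        // Without a schema, Beam cannot infer a coder for Row and fails with the
        // IllegalStateException seen above; setRowSchema lets it use a RowCoder.
        rows.setRowSchema(SCHEMA);
        p.run().waitUntilFinish();
      }
    }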

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])
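
This plan shows both the projection (usedFields) and the filter landing inside the BigQuery source, which is the behavior under test. For reference, the same query shape can be run against any schema-aware PCollection with SqlTransform; a minimal sketch, reusing a PCollection<Row> with fields by/type/title/score such as `rows` from the earlier sketch (illustrative only, not the integration test's actual code):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Applied to a single PCollection, Beam SQL exposes it as table PCOLLECTION.
    PCollection<Row> filtered =
        rows.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));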

    Sep 24, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 24, 2021 12:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-iWELRk0J0eqj9_ooQn6KLmU0Q20eazTA8el4stWEfXY.jar
    Sep 24, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8714511773009991827.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6wFErkUUNZ4s4RVGNRiPabbe0XpmMP2R2HDyrdydQ2g.jar
    Sep 24, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 24, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash 4a59f11164f54caf0e7ddadfad8ed5a00395cc37863c94d8f4d06654dc0001fb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-SlnxEWT1TK8OfdrfrY7VoAOVzDeGPJTY9NBmVNwAAfs.pb
    Sep 24, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 24, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 24, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 24, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-24_05_45_07-6918030974945565860?project=apache-beam-testing
    Sep 24, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-24_05_45_07-6918030974945565860
    Sep 24, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-24_05_45_07-6918030974945565860
    Sep 24, 2021 12:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-24T12:45:13.604Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:45:18.112Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 24, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:45:18.924Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:45:18.959Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:45:18.990Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:45:19.049Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:45:19.078Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:45:19.110Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 24, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:45:19.447Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 24, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:45:19.521Z: Starting 5 workers in us-central1-a...
    Sep 24, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:45:45.706Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:46:04.128Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:46:30.352Z: Workers have started successfully.
    Sep 24, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:46:30.389Z: Workers have started successfully.
    Sep 24, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:47:00.401Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 24, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:47:00.558Z: Cleaning up.
    Sep 24, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:47:00.625Z: Stopping worker pool...
    Sep 24, 2021 12:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:49:22.960Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2021 12:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T12:49:23.011Z: Worker pool stopped.
    Sep 24, 2021 12:49:29 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-24_05_45_07-6918030974945565860 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 84b2f5f2-d035-4207-ac57-446d4eb65e0b and timestamp: 2021-09-24T12:49:29.063000000Z:
                     Metric:                    Value:
                   read_time                     9.428
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2021 12:49:29 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 39.572 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 12s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/dbm4efp37xfj2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2463

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2463/display/redirect>

Changes:


------------------------------------------
[...truncated 341.76 KB...]
> Task :sdks:java:extensions:sql:perf-tests:testClasses
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) started.
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is c75df4ad0b4326aabbad29be2039d846
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 24, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 24, 2021 6:45:22 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 24, 2021 6:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 24, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 6:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 6:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2021 6:45:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
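
At the IO level, the pushed-down read logged here corresponds to the BigQuery Storage API options on BigQueryIO: a field projection plus a row restriction. A rough sketch (the fully qualified table name is an assumption; the log only shows the alias HACKER_NEWS):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    BigQueryIO.TypedRead<TableRow> read =
        BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full") // assumed table
            .withMethod(Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
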
    Sep 24, 2021 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2021 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2021 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-iWELRk0J0eqj9_ooQn6KLmU0Q20eazTA8el4stWEfXY.jar
    Sep 24, 2021 6:45:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2515785665617398108.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-u1pRGU47V9-INDq0nZ2fj7xYX9wjxftvRxS29b0WMc8.jar
    Sep 24, 2021 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 24, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2021 6:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash bd7906fb3c8587d81c9d0a6badaf4b6f7f301a59a2735bb43cb034bb7cf1baf3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vXkG-zyFh9gcnQprra9Lb38wGlmic1u0PLA0u3zxuvM.pb
    Sep 24, 2021 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2021 6:45:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 24, 2021 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 24, 2021 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 24, 2021 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-23_23_45_38-4999594876451740195?project=apache-beam-testing
    Sep 24, 2021 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-23_23_45_38-4999594876451740195
    Sep 24, 2021 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-23_23_45_38-4999594876451740195
    Sep 24, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-24T06:45:41.792Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2021 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:45:50.479Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 24, 2021 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:45:51.390Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2021 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:45:51.434Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2021 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:45:51.472Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2021 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:45:51.588Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2021 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:45:51.624Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2021 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:45:51.663Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 24, 2021 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:45:52.082Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 24, 2021 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:45:52.187Z: Starting 5 workers in us-central1-c...
    Sep 24, 2021 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:45:56.065Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:46:34.168Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:46:59.714Z: Workers have started successfully.
    Sep 24, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:46:59.740Z: Workers have started successfully.
    Sep 24, 2021 6:47:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:47:30.615Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 24, 2021 6:47:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:47:30.823Z: Cleaning up.
    Sep 24, 2021 6:47:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:47:30.918Z: Stopping worker pool...
    Sep 24, 2021 6:49:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:49:53.558Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2021 6:49:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T06:49:53.611Z: Worker pool stopped.
    Sep 24, 2021 6:49:59 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-23_23_45_38-4999594876451740195 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9c9b8c78-6057-483d-bc43-3e94b030d40a and timestamp: 2021-09-24T06:49:59.724000000Z:
                     Metric:                    Value:
                   read_time                     9.254
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2021 6:50:00 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 42.017 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 40s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/wbq5hydbz3fqi

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2462

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2462/display/redirect?page=changes>

Changes:

[Brian Hulette] Add back 'more' break

[chuck.yang] Run BigQuery queries with batch priority

[chuck.yang] Allow changing query priority in ReadFromBigQuery

[chuck.yang] Update CHANGES

[alexander.chermenin] [BEAM-10822] Fixed typo in BigqueryClient

[zyichi] [BEAM-12898] Minor fix to Kubernetes.groovy postBuildScript.

[noreply] [BEAM-12869] Bump tensorflow from 2.5.0 to 2.5.1 in

[noreply] [BEAM-12946] Fix SmallestPerKey, add unit testing (#15567)

[zyichi] [BEAM-12908] Sickbay pubsublite.ReadWriteIT

[noreply] Relocate Go SDK breaking change note out of template into 2.33.0.


------------------------------------------
[...truncated 340.76 KB...]
> Task :sdks:java:extensions:sql:perf-tests:testClasses
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker Thread 31,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker Thread 31,5,main]) started.
Gradle Test Executor 57 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is c75df4ad0b4326aabbad29be2039d846
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 57'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 57'
Successfully started process 'Gradle Test Executor 57'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 24, 2021 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 24, 2021 12:45:19 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 24, 2021 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 24, 2021 12:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 12:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
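
Both coder failures in this build reduce to the same thing: the RowMonitor output is a PCollection<Row>, and Rows have no default Coder unless a Schema is attached. A minimal, self-contained sketch of the remedy the message names (PCollection.setRowSchema), using an assumed four-field schema matching the projected columns rather than the IT's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Assumed schema matching the columns projected by the SQL above.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    MapElements.into(TypeDescriptor.of(Row.class))
                        .via(
                            (String s) ->
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "a title", 3L)
                                    .build()))
                // Without this, finishSpecifying() raises the IllegalStateException
                // seen above: a Row output has no inferable default Coder.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }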

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1299075293]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
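
readUsingDefaultMethod trips over the identical root cause; only the read path differs. The message's other remedy, an explicit coder, is the one-line variant of the sketch above, with RowCoder from org.apache.beam.sdk.coders and the same assumed schema:

        .setCoder(RowCoder.of(schema))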

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 24, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 24, 2021 12:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 24, 2021 12:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
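
The push-down seen here hinges on the table being registered with method DIRECT_READ; with the DEFAULT method the planner falls back to the BeamCalcRel-over-full-read plan shown earlier. A rough, self-contained sketch of the same query outside the IT (the table location and the exact DDL/withDdlString shape are assumptions to check against the Beam SQL BigQuery docs):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // DIRECT_READ lets the planner emit BeamPushDownIOSourceRel, handing the
        // projection and filter to the BigQuery Storage API. Column list abbreviated
        // to the four fields the query uses; the real table has more.
        String ddl =
            "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, `type` VARCHAR, "
                + "`title` VARCHAR, `score` BIGINT) "
                + "TYPE bigquery "
                + "LOCATION 'apache-beam-testing:beam.HACKER_NEWS' "
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";

        PCollection<Row> rows =
            p.apply(
                SqlTransform.query(
                        "SELECT `by` AS author, `type`, `title`, `score` FROM HACKER_NEWS "
                            + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2")
                    .withDdlString(ddl));

        p.run().waitUntilFinish();
      }
    }
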
    Sep 24, 2021 12:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 24, 2021 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 24, 2021 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-iWELRk0J0eqj9_ooQn6KLmU0Q20eazTA8el4stWEfXY.jar
    Sep 24, 2021 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3067383912730159324.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RDIeCsZwHBmQy94ZoMBrlbsiVS69QscI-QVNIZ6ntMA.jar
    Sep 24, 2021 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 24, 2021 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 24, 2021 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 18c0661a19c3a6b5bfe0fa3d91a384419be310a1070c2e0344467c82697f0056> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-GMBmGhnDprW_4Po9kaOEQZvjEKEHDC4DREZ8gml_AFY.pb
    Sep 24, 2021 12:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 24, 2021 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 24, 2021 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 24, 2021 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 24, 2021 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-23_17_45_32-14184398797648112020?project=apache-beam-testing
    Sep 24, 2021 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-23_17_45_32-14184398797648112020
    Sep 24, 2021 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-23_17_45_32-14184398797648112020
    Sep 24, 2021 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-24T00:45:38.208Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 24, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:45:43.366Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 24, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:45:44.025Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 24, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:45:44.075Z: Expanding GroupByKey operations into optimizable parts.
    Sep 24, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:45:44.115Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 24, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:45:44.187Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 24, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:45:44.214Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 24, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:45:44.247Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 24, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:45:44.588Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 24, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:45:44.641Z: Starting 5 workers in us-central1-a...
    Sep 24, 2021 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:46:05.709Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 24, 2021 12:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:46:29.209Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 24, 2021 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:46:55.420Z: Workers have started successfully.
    Sep 24, 2021 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:46:55.456Z: Workers have started successfully.
    Sep 24, 2021 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:47:24.505Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 24, 2021 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:47:24.647Z: Cleaning up.
    Sep 24, 2021 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:47:24.721Z: Stopping worker pool...
    Sep 24, 2021 12:49:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:49:43.602Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 24, 2021 12:49:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-24T00:49:43.645Z: Worker pool stopped.
    Sep 24, 2021 12:49:50 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-23_17_45_32-14184398797648112020 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c9e709a3-21e4-4373-be6a-7d4e5ad078bd and timestamp: 2021-09-24T00:49:50.754000000Z:
                     Metric:                    Value:
                   read_time                       9.0
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 24, 2021 12:49:51 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
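
That warning means the read_time/fields_read values above were only printed, not exported. InfluxDB publishing in the Beam test utilities is driven by pipeline options roughly of the shape below; the option names are recalled from org.apache.beam.sdk.testutils.publishing and should be verified against the source before use:

    --influxDatabase=beam_test_metrics
    --influxMeasurement=sql_bqio_read_java_batch
    --influxHost=http://localhost:8086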

Gradle Test Executor 57 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.006 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.004 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker Thread 31,5,main]) completed. Took 4 mins 36.6 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 31s
152 actionable tasks: 100 executed, 52 from cache

Publishing build scan...
https://gradle.com/s/qcqjg3bvildw2

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2461

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2461/display/redirect?page=changes>

Changes:

[noreply] Avoid setting empty builders in proto setters

[Robert Bradshaw] Fix some website logos.

[kawaigin] Implicitly watch and track anonymous pipeline and PCollections

[chamikaramj] Reject requests when parameter names cannot be validated unless

[chamikaramj] Updates error message


------------------------------------------
[...truncated 352.53 KB...]
Gradle Test Executor 3 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 23, 2021 6:47:08 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 23, 2021 6:47:09 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 23, 2021 6:47:10 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 23, 2021 6:47:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 6:47:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:47:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 6:47:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 6:47:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:47:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 6:47:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1741599875]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 6:47:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1261942871]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2021 6:47:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 23, 2021 6:47:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2021 6:47:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2021 6:47:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-iWELRk0J0eqj9_ooQn6KLmU0Q20eazTA8el4stWEfXY.jar
    Sep 23, 2021 6:47:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-iWELRk0J0eqj9_ooQn6KLmU0Q20eazTA8el4stWEfXY.jar
    Sep 23, 2021 6:47:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2936684944477571990.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-CIXDPNEZW4V5N8CqNLzcgGSVUVSm7_dZ4ZyZ2qG_UU0.jar
    Sep 23, 2021 6:47:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.34.0-SNAPSHOT-vZyLxaaCl52ich-R-UQheFzpjD42oKouyyEoIx03QC4.jar
    Sep 23, 2021 6:47:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.34.0-SNAPSHOT-rHNNJt8fKjoZ7FjgFCzASOqHR3-JCxYQBb_7vqA7ZtE.jar
    Sep 23, 2021 6:47:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 4 files newly uploaded in 2 seconds
    Sep 23, 2021 6:47:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2021 6:47:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 027ad43ceaa0e50b63f22db5cc5fa88a58b4726aebbc49592b8a841efc2d83ed> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-AnrUPOqg5Qtj8i21zF-oili0cmrrvElZK4qEHvwtg-0.pb
    Sep 23, 2021 6:47:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2021 6:47:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 23, 2021 6:47:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 23, 2021 6:47:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 23, 2021 6:47:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-23_11_47_29-5175217799069690722?project=apache-beam-testing
    Sep 23, 2021 6:47:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-23_11_47_29-5175217799069690722
    Sep 23, 2021 6:47:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-23_11_47_29-5175217799069690722
    Sep 23, 2021 6:47:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-23T18:47:32.940Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 23, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:47:39.547Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 23, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:47:40.285Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:47:40.326Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:47:40.376Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:47:40.458Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:47:40.488Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:47:40.513Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 23, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:47:40.855Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 23, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:47:40.933Z: Starting 5 workers in us-central1-a...
    Sep 23, 2021 6:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:47:57.740Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2021 6:48:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:48:31.167Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 23, 2021 6:48:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:48:31.192Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 23, 2021 6:48:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:48:41.526Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2021 6:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:49:03.176Z: Workers have started successfully.
    Sep 23, 2021 6:49:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:49:03.228Z: Workers have started successfully.
    Sep 23, 2021 6:49:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:49:31.385Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 23, 2021 6:49:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:49:31.526Z: Cleaning up.
    Sep 23, 2021 6:49:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:49:31.621Z: Stopping worker pool...
    Sep 23, 2021 6:51:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:51:46.747Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2021 6:51:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T18:51:46.792Z: Worker pool stopped.
    Sep 23, 2021 6:51:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-23_11_47_29-5175217799069690722 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4d760c17-9dbb-4873-8634-904f58e0e733 and timestamp: 2021-09-23T18:51:52.056000000Z:
                     Metric:                    Value:
                   read_time                     8.396
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2021 6:51:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.048 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.053 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 52.801 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 19s
152 actionable tasks: 103 executed, 49 from cache

Publishing build scan...
https://gradle.com/s/2edlqbjwfog4y

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2460

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2460/display/redirect>

Changes:


------------------------------------------
[...truncated 341.47 KB...]

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 23, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 23, 2021 12:45:02 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 23, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 23, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1505334907]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@40782033]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2021 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 23, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 23, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3945536414481175957.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-upxuCbr27peujoVKE7chutrtsZa5e-BThvm_dK1TBRU.jar
    Sep 23, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT-tests-I-osQrG-lSEES6RD4ibuovIeVl-_U3nOEEsBLo1rYrA.jar
    Sep 23, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT-4bF8QpaxkWlkdPssX5OV0S9rIzFuRU0pojrN2fBwjjk.jar
    Sep 23, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.34.0-SNAPSHOT-xz-aSlhizJi8jigGo3amZ2g_MkglE3dBulicsi2UX5U.jar
    Sep 23, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/udf/build/libs/beam-sdks-java-extensions-sql-udf-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-udf-2.34.0-SNAPSHOT-MlQQAS1Je1cH4lXtIOKM3zL6o97S5F7sJQ3zUailw08.jar
    Sep 23, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 5 files newly uploaded in 1 seconds
    Sep 23, 2021 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2021 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104000 bytes, hash af9781628e59b27097793bc43892cf6bf3bc88186592c229304c113d5f985145> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-r5eBYo5ZsnCXeTvEOJLPa_O8iBhlksIpMEwRPV-YUUU.pb
    Sep 23, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 23, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 23, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 23, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-23_05_45_17-9181264945269481986?project=apache-beam-testing
    Sep 23, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-23_05_45_17-9181264945269481986
    Sep 23, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-23_05_45_17-9181264945269481986
    Sep 23, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-23T12:45:24.072Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 23, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:45:37.032Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:45:37.680Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:45:37.723Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:45:37.753Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:45:37.830Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:45:37.866Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:45:37.897Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:45:38.253Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 23, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:45:38.342Z: Starting 5 workers in us-central1-c...
    Sep 23, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:46:03.313Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:46:12.594Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 23, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:46:12.624Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 23, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:46:22.906Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:46:48.198Z: Workers have started successfully.
    Sep 23, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:46:48.235Z: Workers have started successfully.
    Sep 23, 2021 12:47:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:47:19.094Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 23, 2021 12:47:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:47:19.247Z: Cleaning up.
    Sep 23, 2021 12:47:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:47:19.335Z: Stopping worker pool...
    Sep 23, 2021 12:49:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:49:50.498Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2021 12:49:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T12:49:50.548Z: Worker pool stopped.
    Sep 23, 2021 12:49:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-23_05_45_17-9181264945269481986 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6df0a03d-f87a-439f-a8e9-2a9cc96fc940 and timestamp: 2021-09-23T12:49:58.438000000Z:
                     Metric:                    Value:
                   read_time                     8.838
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2021 12:49:58 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
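
The warning above means the run was started without the InfluxDB settings the publisher reads from the pipeline options; a hedged example of the missing flags (option names as used by Beam's InfluxDBPublisher/InfluxDBSettings test utilities; the values here are placeholders):

    --influxMeasurement=sql_bqio_read_java_batch
    --influxDatabase=beam_test_metrics
    --influxHost=http://<influx-host>:8086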

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.067 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 5 mins 1.666 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 34s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/b6ar7voroa7cm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2459

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2459/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12383] Adding Go SDK and Kafka IO to Gradle cross-language test


------------------------------------------
[...truncated 340.97 KB...]
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) started.
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is ef521024ff2c41cc9caf5e1ede2e99a5
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'
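
To reproduce this run by hand, a sketch of the equivalent invocation (the Gradle task and test class come from this log; the property name and option values are assumptions based on Beam's usual integration-test setup, not taken from this job):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        --tests org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT \
        -DintegrationTestPipelineOptions='["--project=<your-project>",
            "--tempLocation=gs://<your-bucket>/temp",
            "--runner=DataflowRunner","--region=us-central1"]'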

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 23, 2021 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
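
A one-line fix for this deprecation, keeping the (empty) value the job currently passes in -DbeamTestPipelineOptions:

    "--sdkContainerImage="    (in place of "--workerHarnessContainerImage=")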
    Sep 23, 2021 6:45:20 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 23, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 23, 2021 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
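
The exception above lists its own remedies. A minimal sketch of applying them, assuming a pass-through DoFn over Beam Rows (the schema, field names, and the `rows` variable are illustrative, not the IT's actual code):

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Assumed schema for illustration; the real test derives it from the table.
    Schema schema = Schema.builder()
        .addStringField("author")
        .addStringField("type")
        .addStringField("title")
        .addInt64Field("score")
        .build();

    // rows: an existing PCollection<Row>, e.g. the output of the SQL rel.
    PCollection<Row> monitored =
        rows.apply("RowMonitor", ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void process(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);  // pass-through; the output coder is unknown here
              }
            }))
            // Remedy 2 from the message: attach a schema to the output...
            .setRowSchema(schema);
    // ...or, equivalently, remedy 1: monitored.setCoder(RowCoder.of(schema));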

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2021 6:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 6:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 6:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 6:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 6:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 6:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2021 6:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
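
For reference, the pushed-down read above corresponds roughly to this direct BigQueryIO usage (a sketch, not the IT's code: the table spec is an assumption, while the selected fields and the row restriction are the ones just logged):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    Pipeline p = Pipeline.create();
    p.apply(
        "Read Input BQ Rows with push-down",
        BigQueryIO.readTableRows()
            .from("apache-beam-testing:beam.HACKER_NEWS")  // assumed table spec
            .withMethod(TypedRead.Method.DIRECT_READ)
            // Projection push-down: only the used fields leave BigQuery.
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // Predicate push-down: the supported filter is applied server-side.
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));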
    Sep 23, 2021 6:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 23, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5910955340003637270.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-g4Rx_5rhi-UXfSpycV8e91lWcnG12qQ71zBea2Ze0mI.jar
    Sep 23, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.12.7/1f45c6a397feeb46da75425619333d1cc6f90f78/mongo-java-driver-3.12.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.12.7-D_zgBJWNb9mzmuetJ37a0X9XtpcfSGsXYpxe6eE8Tao.jar
    Sep 23, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 248 files cached, 2 files newly uploaded in 0 seconds
    Sep 23, 2021 6:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash b669c72af7daf9ece45c5f55b2c8232ff9275730054fa0808ad3cd9b2e0c2d90> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tmnHKvfa-ezkXF9VssgjL_knVzAFT6CAitPNmy4MLZA.pb
    Sep 23, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 23, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 23, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 23, 2021 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-22_23_45_34-17970570427316720099?project=apache-beam-testing
    Sep 23, 2021 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-22_23_45_34-17970570427316720099
    Sep 23, 2021 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-22_23_45_34-17970570427316720099
    Sep 23, 2021 6:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-23T06:45:38.771Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 23, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:45:45.510Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 23, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:45:46.202Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:45:46.231Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:45:46.261Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:45:46.326Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2021 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:45:46.357Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2021 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:45:46.390Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 23, 2021 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:45:46.751Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 23, 2021 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:45:46.819Z: Starting 5 workers in us-central1-a...
    Sep 23, 2021 6:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:45:53.710Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:46:32.942Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:46:58.443Z: Workers have started successfully.
    Sep 23, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:46:58.480Z: Workers have started successfully.
    Sep 23, 2021 6:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:47:27.001Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 23, 2021 6:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:47:27.125Z: Cleaning up.
    Sep 23, 2021 6:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:47:27.190Z: Stopping worker pool...
    Sep 23, 2021 6:49:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:49:46.637Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2021 6:49:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T06:49:46.707Z: Worker pool stopped.
    Sep 23, 2021 6:49:52 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-22_23_45_34-17970570427316720099 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 37e52fc6-fcec-47eb-95ae-a6d4e0659e53 and timestamp: 2021-09-23T06:49:52.471000000Z:
                     Metric:                    Value:
                   read_time                     9.162
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2021 6:49:52 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 37.547 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 33s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/ewao26yabe3tm

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2458

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2458/display/redirect?page=changes>

Changes:

[kileysok] Add MapState and SetState support

[Kyle Weaver] [BEAM-12898] Use new postbuildscript dsl in Flink load tests

[Kyle Weaver] run postbuild regardless of test result

[Kyle Weaver] spotless

[noreply] Merge pull request #15487 from [BEAM-12812] - Run Github Actions on GCP

[noreply] [BEAM-11097] Add SideInputCache to harness control type (#15530)

[kawaigin] Updated interactive integration test golden screenshots.

[noreply] Revert PR 15487 (BEAM-12812) (#15554)

[zyichi] [BEAM-12898] Trying out solution suggestion for JENKINS-66189 to solve


------------------------------------------
[...truncated 340.82 KB...]
> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 52ebef38e18ea6b3c79e97ca10aec7b1
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 23, 2021 12:46:56 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 23, 2021 12:46:57 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 23, 2021 12:46:57 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 23, 2021 12:47:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 12:47:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:47:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 12:47:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 12:47:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:47:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 12:47:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 23, 2021 12:47:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2021 12:47:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:47:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 12:47:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 23, 2021 12:47:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:47:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 12:47:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2021 12:47:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 12:47:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:47:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 23, 2021 12:47:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 23, 2021 12:47:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 23, 2021 12:47:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 23, 2021 12:47:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 23, 2021 12:47:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 23, 2021 12:47:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 23, 2021 12:47:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 23, 2021 12:47:06 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 23, 2021 12:47:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5184320067722493232.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-az2qEX0NxQwDhCTXI2k7ebhcmbbYnVpTPUkWAcHylm4.jar
    Sep 23, 2021 12:47:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-bigtable-emulator/0.137.1/b1639aa134de1302e43d9e9c3843f6ff853c510f/google-cloud-bigtable-emulator-0.137.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigtable-emulator-0.137.1-V2SlLU4BjIGf5QoF7YL-A-QNjw49NdXrNM4QoX9MXSI.jar
    Sep 23, 2021 12:47:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 248 files cached, 2 files newly uploaded in 2 seconds
    Sep 23, 2021 12:47:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 23, 2021 12:47:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103999 bytes, hash 11b94f5423a572a9f59f1157c15775ecabb636195aa71bd2dc2e72adbf45c43b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EblPVCOlcqn1nxFXwVd17Ku2NhlapxvS3C5yrb9FxDs.pb
    Sep 23, 2021 12:47:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 23, 2021 12:47:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 23, 2021 12:47:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 23, 2021 12:47:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 23, 2021 12:47:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-22_17_47_11-3244588918368151576?project=apache-beam-testing
    Sep 23, 2021 12:47:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-22_17_47_11-3244588918368151576
    Sep 23, 2021 12:47:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-22_17_47_11-3244588918368151576
    Sep 23, 2021 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-23T00:47:15.522Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 23, 2021 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:47:23.824Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 23, 2021 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:47:24.603Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 23, 2021 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:47:24.642Z: Expanding GroupByKey operations into optimizable parts.
    Sep 23, 2021 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:47:24.672Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 23, 2021 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:47:24.767Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 23, 2021 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:47:24.806Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 23, 2021 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:47:24.845Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 23, 2021 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:47:25.323Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 23, 2021 12:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:47:25.411Z: Starting 5 workers in us-central1-c...
    Sep 23, 2021 12:47:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:47:48.955Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 23, 2021 12:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:48:01.559Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 23, 2021 12:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:48:01.614Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 23, 2021 12:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:48:11.900Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 23, 2021 12:48:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:48:38.117Z: Workers have started successfully.
    Sep 23, 2021 12:48:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:48:38.151Z: Workers have started successfully.
    Sep 23, 2021 12:49:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:49:09.689Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 23, 2021 12:49:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:49:09.869Z: Cleaning up.
    Sep 23, 2021 12:49:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:49:09.963Z: Stopping worker pool...
    Sep 23, 2021 12:51:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:51:37.848Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 23, 2021 12:51:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-23T00:51:37.925Z: Worker pool stopped.
    Sep 23, 2021 12:51:45 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-22_17_47_11-3244588918368151576 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0adb44ea-5b14-4349-8bae-16d579796dd5 and timestamp: 2021-09-23T00:51:45.159000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.401

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 23, 2021 12:51:45 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 53.002 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 50s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/gidlazahzrocc

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2457

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2457/display/redirect>

Changes:


------------------------------------------
[...truncated 338.67 KB...]
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 184d1d1e4a85b12c1e9374eee179eec0
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
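
The -DbeamTestPipelineOptions property in the long command line above is how these integration tests receive their pipeline flags: a JSON array set as a system property, which Beam's TestPipeline parses into PipelineOptions on demand. A minimal sketch of that mechanism, assuming beam-sdks-java-core and beam-runners-direct-java are on the classpath (the class name is illustrative):

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.testing.TestPipeline;

    public class BeamTestOptionsSketch {
      public static void main(String[] args) {
        // Stand-in for the Gradle worker's -DbeamTestPipelineOptions=[...]:
        // a JSON array of pipeline flags set as a system property.
        System.setProperty(
            "beamTestPipelineOptions",
            "[\"--runner=DirectRunner\", \"--jobName=sql-bqio-sketch\"]");
        // TestPipeline reads and parses the property into PipelineOptions.
        PipelineOptions options = TestPipeline.testingPipelineOptions();
        System.out.println(options.getJobName()); // prints: sql-bqio-sketch
      }
    }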

Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 22, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 22, 2021 6:44:58 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 22, 2021 6:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 22, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@784892670]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
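
Every failing test in these runs hits the same coder-inference problem: a ParDo that re-emits Beam Row elements gives the SDK no way to infer a Row coder, so the schema must be attached to the output PCollection explicitly. A minimal standalone sketch of the remedy the message itself names (PCollection.setRowSchema); it is illustrative rather than the IT's actual code, and it assumes the direct runner is on the classpath:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        Schema schema = Schema.builder()
            .addStringField("author")
            .addInt32Field("score")
            .build();
        PCollection<Row> rows =
            p.apply(Create.of(Row.withSchema(schema).addValues("alice", 3).build()))
                // Rows need an explicit schema; coder inference cannot supply one.
                .setRowSchema(schema);
        // A pass-through ParDo like the IT's RowMonitor: its Row output must
        // also carry the schema, or getCoder() fails exactly as in the log.
        rows.apply(ParDo.of(
                new DoFn<Row, Row>() {
                  @ProcessElement
                  public void process(@Element Row r, OutputReceiver<Row> out) {
                    out.output(r);
                  }
                }))
            .setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }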

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1151017424]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 22, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 22, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1244217088328882518.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-gaHEQFRtP_RhA-CnXy22XXTvOIgKB8QuwHJmhQY5QcM.jar
    Sep 22, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 22, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash b914cb50a0d09d4ec672e60bd0efed8060794a88f55dcb567218c7faff2d154b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uRTLUKDQnU7GcuYL0O_tgGB5Soj1XctWchjH-v8tFUs.pb
    Sep 22, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 22, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 22, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 22, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-22_11_45_12-13103536790849586117?project=apache-beam-testing
    Sep 22, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-22_11_45_12-13103536790849586117
    Sep 22, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-22_11_45_12-13103536790849586117
    Sep 22, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-22T18:45:15.394Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:23.116Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 22, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:23.700Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:23.744Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:23.782Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:23.874Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:23.914Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:23.948Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 22, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:24.342Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 22, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:24.427Z: Starting 5 workers in us-central1-c...
    Sep 22, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:36.599Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2021 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:59.901Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 22, 2021 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:45:59.954Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 22, 2021 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:46:10.241Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2021 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:46:35.293Z: Workers have started successfully.
    Sep 22, 2021 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:46:35.318Z: Workers have started successfully.
    Sep 22, 2021 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:47:07.293Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 22, 2021 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:47:07.461Z: Cleaning up.
    Sep 22, 2021 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:47:07.542Z: Stopping worker pool...
    Sep 22, 2021 6:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:49:30.694Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2021 6:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T18:49:30.743Z: Worker pool stopped.
    Sep 22, 2021 6:49:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-22_11_45_12-13103536790849586117 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2a3475f2-b7be-478c-ab0d-6c180f4e11af and timestamp: 2021-09-22T18:49:37.336000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.199
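
The one passing test also shows what the push-down buys: in the BEAMPlan above, the LogicalProject and LogicalFilter collapse into the BeamPushDownIOSourceRel, so only the four used fields and the rows matching the filter are read from BigQuery, which is what the fields_read and read_time figures above are measuring. As a hedged sketch, the same query shape can be run with SqlTransform over any schema-aware PCollection; over an in-memory table like this the predicate is evaluated in a BeamCalcRel rather than pushed into the storage read (names and values here are illustrative):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQueryShapeSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();
        PCollection<Row> hackerNews =
            p.apply(Create.of(
                    Row.withSchema(schema).addValues("a", "story", "t1", 5).build(),
                    Row.withSchema(schema).addValues("b", "comment", "t2", 9).build()))
                .setRowSchema(schema);
        // The single input is addressable as PCOLLECTION; `by` is backquoted
        // because BY is a reserved word, as in the IT's own query.
        hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }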

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2021 6:49:37 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.348 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 45.159 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 18s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/a4t4wz5trhy74

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2456

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2456/display/redirect?page=changes>

Changes:

[danthev] Fix 2.32.0 release notes.


------------------------------------------
[...truncated 337.21 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 184d1d1e4a85b12c1e9374eee179eec0
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 22, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 22, 2021 12:44:54 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 22, 2021 12:44:55 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 22, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 22, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 22, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1080326950153223984.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-E0JAGKSgqH47A6_9SBeNMJe8OWoJ0gbHg_d6TMB2vVw.jar
    Sep 22, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 22, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 73213c3a9e936855b3bb3a48c7e5adb4a07f9b322108c820f91b481ec04af87f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cyE8Op6TaFWzuzpIx-WttKB_mzIhCMgg-RtIHsBK-H8.pb
    Sep 22, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 22, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 22, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 22, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-22_05_45_09-15371512024525983311?project=apache-beam-testing
    Sep 22, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-22_05_45_09-15371512024525983311
    Sep 22, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-22_05_45_09-15371512024525983311
    Sep 22, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-22T12:45:12.484Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:45:20.382Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:45:21.185Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:45:21.234Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:45:21.269Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:45:21.352Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:45:21.391Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:45:21.422Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:45:21.880Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 22, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:45:21.957Z: Starting 5 workers in us-central1-c...
    Sep 22, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:45:50.153Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:46:11.003Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:46:36.335Z: Workers have started successfully.
    Sep 22, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:46:36.366Z: Workers have started successfully.
    Sep 22, 2021 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:47:07.601Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 22, 2021 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:47:07.759Z: Cleaning up.
    Sep 22, 2021 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:47:07.848Z: Stopping worker pool...
    Sep 22, 2021 12:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:49:29.844Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2021 12:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T12:49:29.906Z: Worker pool stopped.
    Sep 22, 2021 12:49:35 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-22_05_45_09-15371512024525983311 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 36c0a812-054a-401b-bf6a-0b82689eea44 and timestamp: 2021-09-22T12:49:35.225000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.325

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2021 12:49:35 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 45.649 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 17s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/uaklsmfn5dgek

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2455

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2455/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15537 from [BEAM-12908] Add a sleep to the IT after


------------------------------------------
[...truncated 337.68 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 184d1d1e4a85b12c1e9374eee179eec0
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 22, 2021 6:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 22, 2021 6:44:54 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 22, 2021 6:44:55 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 22, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@784892670]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1151017424]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 22, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 22, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test171101041235627558.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-NU1LvmrDW5YH4cBRSQX_zfD2xwsH9tyX1bR9_ahgIkI.jar
    Sep 22, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 22, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash e6eb149ebfe9b6e0adf5512916244aa35a3f7f59620b42cbcc6dd0a5874c9e02> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-5usUnr_ptuCt9VEpFiRKo1o_f1liC0LLzG3QpYdMngI.pb
    Sep 22, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 22, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 22, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 22, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-21_23_45_08-831295087074972985?project=apache-beam-testing
    Sep 22, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-21_23_45_08-831295087074972985
    Sep 22, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-21_23_45_08-831295087074972985
    Sep 22, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-22T06:45:11.641Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:45:17.337Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 22, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:45:18.129Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:45:18.166Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:45:18.191Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:45:18.257Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:45:18.294Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:45:18.323Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 22, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:45:18.730Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 22, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:45:18.808Z: Starting 5 workers in us-central1-a...
    Sep 22, 2021 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:45:49.475Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2021 6:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:46:03.070Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:46:29.367Z: Workers have started successfully.
    Sep 22, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:46:29.427Z: Workers have started successfully.
    Sep 22, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:46:58.337Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 22, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:46:58.483Z: Cleaning up.
    Sep 22, 2021 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:46:58.566Z: Stopping worker pool...
    Sep 22, 2021 6:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:49:23.097Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2021 6:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T06:49:23.169Z: Worker pool stopped.
    Sep 22, 2021 6:49:30 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-21_23_45_08-831295087074972985 finished with status DONE.

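For contrast with the two failures, the push-down path above runs end to end: the planner collapses the project and filter into a single BeamPushDownIOSourceRel, so only the four used fields and the pre-filtered rows leave BigQuery. Conceptually (an assumption about the mechanism, not something this log shows), the plan maps onto a BigQuery Storage v1 read session roughly like this:

    import com.google.cloud.bigquery.storage.v1.ReadSession;

    public class PushDownSketch {
      public static void main(String[] args) {
        // usedFields=[[by, type, title, score]] becomes the projection;
        // the pushed filter string becomes the row restriction.
        ReadSession.TableReadOptions options =
            ReadSession.TableReadOptions.newBuilder()
                .addSelectedFields("by")
                .addSelectedFields("type")
                .addSelectedFields("title")
                .addSelectedFields("score")
                .setRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2")
                .build();
        System.out.println(options);
      }
    }
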
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7ee577c3-a708-4b2f-8b17-aca66c653bcc and timestamp: 2021-09-22T06:49:30.654000000Z:
                     Metric:                    Value:
                   read_time                     8.108
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2021 6:49:31 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

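The InfluxDB warning does not affect the pass/fail outcome, but it means the read_time and fields_read values above were never exported. Addressing it would mean adding the missing settings to the -DbeamTestPipelineOptions array shown earlier in the log; the option names and values below follow Beam's InfluxDBSettings conventions and are assumptions, not something this log confirms:

    # Assumed option names (--influxMeasurement/--influxDatabase/--influxHost)
    # and placeholder values; verify against
    # org.apache.beam.sdk.testutils.publishing before use.
    -DbeamTestPipelineOptions=[...,
      "--influxMeasurement=sql_bqio_read_java_batch",
      "--influxDatabase=beam_test_metrics",
      "--influxHost=http://localhost:8086"]
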
Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 41.272 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 13s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/iwf5unyl6uklk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2454

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2454/display/redirect?page=changes>

Changes:

[kawaigin] [BEAM-10708] Added an example notebook for beam_sql magic

[noreply] Add a timeout for BQ streaming_insert RPCS (#15541)


------------------------------------------
[...truncated 337.32 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is af5aef652cb0ffd2e4d4d1a71b16ad28
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 22, 2021 12:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 22, 2021 12:44:59 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 22, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 22, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@784892670]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 22, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 22, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1151017424]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 22, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 22, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 22, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 22, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 22, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 22, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 22, 2021 12:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 22, 2021 12:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 22, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4274549280785273227.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-d9SpcQgcM0lltFLrjsbMZhYEmwb_zY2Ng-lF5yftMSc.jar
    Sep 22, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 22, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 22, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 688b941c99458e9229a06f828b9731d51d518db03490136639a5a3539100c897> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aIuUHJlFjpIpoG-Ci5cx1R1RjbA0kBNmOaWjU5EAyJc.pb
    Sep 22, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 22, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 22, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 22, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 22, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-21_17_45_12-18409918559903482452?project=apache-beam-testing
    Sep 22, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-21_17_45_12-18409918559903482452
    Sep 22, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-21_17_45_12-18409918559903482452
    Sep 22, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-22T00:45:15.552Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:45:21.291Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 22, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:45:22.034Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 22, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:45:22.067Z: Expanding GroupByKey operations into optimizable parts.
    Sep 22, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:45:22.126Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 22, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:45:22.201Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 22, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:45:22.230Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 22, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:45:22.260Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 22, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:45:22.650Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 22, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:45:22.722Z: Starting 5 workers in us-central1-a...
    Sep 22, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:45:32.085Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 22, 2021 12:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:46:01.335Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 22, 2021 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:46:34.996Z: Workers have started successfully.
    Sep 22, 2021 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:46:35.020Z: Workers have started successfully.
    Sep 22, 2021 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:47:04.530Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 22, 2021 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:47:04.660Z: Cleaning up.
    Sep 22, 2021 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:47:04.742Z: Stopping worker pool...
    Sep 22, 2021 12:49:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:49:28.088Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 22, 2021 12:49:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-22T00:49:28.131Z: Worker pool stopped.
    Sep 22, 2021 12:49:35 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-21_17_45_12-18409918559903482452 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): daeb5ee0-018d-43be-97f3-60ac719aa1a7 and timestamp: 2021-09-22T00:49:35.590000000Z:
                     Metric:                    Value:
                   read_time                     8.915
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 22, 2021 12:49:36 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 41.029 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 14s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4grmsoriyf5so

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2453

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2453/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12919] Removed IBM Streams from runner matrix (#15542)

[noreply] [BEAM-12258] Re-throw exception from forked thread in


------------------------------------------
[...truncated 340.65 KB...]
> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is af5aef652cb0ffd2e4d4d1a71b16ad28
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 21, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 21, 2021 6:45:08 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 21, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 21, 2021 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 6:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 6:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2021 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 21, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 21, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests-5kU7Hyrl28H7hvdoTVd39uKp33WIlhc8ZD64bUU4wzA.jar
    Sep 21, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8522678397766649911.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-nzjuqmCa_lTUZ040_QVeqJtii2ZKtrI20iTcL37msQo.jar
    Sep 21, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 248 files cached, 2 files newly uploaded in 0 seconds
    Sep 21, 2021 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash abdf42e26da4175cb906cbce369ef9635e6de55d2821827503a4577c62fe3847> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-q99C4m2kF1y5BsvONp75Y15t5V0oIYJ1A6RXfGL-OEc.pb
    Sep 21, 2021 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2021 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 21, 2021 6:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 21, 2021 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 21, 2021 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-21_11_45_22-705998211072244175?project=apache-beam-testing
    Sep 21, 2021 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-21_11_45_22-705998211072244175
    Sep 21, 2021 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-21_11_45_22-705998211072244175
    Sep 21, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-21T18:45:25.500Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:45:31.316Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 21, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:45:31.975Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:45:32.015Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:45:32.045Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:45:32.126Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:45:32.158Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:45:32.183Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 21, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:45:32.517Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 21, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:45:32.590Z: Starting 5 workers in us-central1-a...
    Sep 21, 2021 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:46:00.782Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 21, 2021 6:46:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:46:00.808Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 21, 2021 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:46:06.367Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2021 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:46:11.035Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2021 6:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:46:36.825Z: Workers have started successfully.
    Sep 21, 2021 6:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:46:36.860Z: Workers have started successfully.
    Sep 21, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:47:04.617Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 21, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:47:04.790Z: Cleaning up.
    Sep 21, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:47:04.862Z: Stopping worker pool...
    Sep 21, 2021 6:49:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:49:41.520Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2021 6:49:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T18:49:41.562Z: Worker pool stopped.
    Sep 21, 2021 6:49:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-21_11_45_22-705998211072244175 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d2e5245a-460d-4f4e-94b8-a110f52be334 and timestamp: 2021-09-21T18:49:47.959000000Z:
                     Metric:                    Value:
                   read_time                     7.208
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2021 6:49:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
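
    This warning only means the run's metrics were printed but not persisted,
    because the publisher was given no InfluxDB measurement/database. Purely as
    a hedged illustration (assuming the suite wires its publisher through the
    test-utils InfluxDBSettings builder, which this log does not confirm; all
    values below are placeholders), the missing configuration would look
    roughly like:

        import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

        class PublisherConfig {
          static InfluxDBSettings settings() {
            return InfluxDBSettings.builder()
                .withHost("http://localhost:8086")           // placeholder
                .withDatabase("beam_test_metrics")           // placeholder
                .withMeasurement("sql_bqio_read_java_batch") // placeholder
                .get();
          }
        }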

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 44.309 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 31s
152 actionable tasks: 97 executed, 55 from cache

Publishing build scan...
https://gradle.com/s/22wry22hcx2ji

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2452

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2452/display/redirect>

Changes:


------------------------------------------
[...truncated 336.56 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is e50055417125b31f26e1108fef7d1958
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
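
    As an aside on the command line above: the -DbeamTestPipelineOptions system
    property carries the pipeline options as a JSON array, and Beam's
    TestPipeline picks it up automatically when the test creates its pipeline.
    A simplified, illustrative sketch (not this suite's code):

        import org.apache.beam.sdk.testing.TestPipeline;
        import org.apache.beam.sdk.transforms.Create;
        import org.junit.Rule;
        import org.junit.Test;

        public class ExampleIT {
          // TestPipeline.create() parses the JSON array passed via the
          // beamTestPipelineOptions system property on the test JVM above.
          @Rule public final transient TestPipeline pipeline = TestPipeline.create();

          @Test
          public void runsWithInjectedOptions() {
            pipeline.apply(Create.of(1, 2, 3));
            pipeline.run().waitUntilFinish();
          }
        }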

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 21, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 21, 2021 12:44:55 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 21, 2021 12:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 21, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
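
    This coder failure (which readUsingDefaultMethod hits below as well, and
    which repeats in every build in this digest) has a single root cause: the
    PCollection of Beam Rows produced for BeamIOSourceRel carries no schema, so
    no default Coder can be inferred. A minimal Java sketch of the remedy the
    exception itself suggests follows; the field names and types are
    assumptions read off the SELECT list above, not the actual fix applied to
    the suite:

        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        class RowSchemaFix {
          // Hypothetical schema for the four projected columns; the real
          // suite would derive this from the BigQuery table metadata.
          private static final Schema ROW_SCHEMA =
              Schema.builder()
                  .addNullableField("author", Schema.FieldType.STRING)
                  .addNullableField("type", Schema.FieldType.STRING)
                  .addNullableField("title", Schema.FieldType.STRING)
                  .addNullableField("score", Schema.FieldType.INT64)
                  .build();

          // Attaching a schema is what the IllegalStateException asks for;
          // rows.setCoder(RowCoder.of(ROW_SCHEMA)) would be the explicit-coder
          // alternative it mentions.
          static PCollection<Row> attachSchema(PCollection<Row> rows) {
            return rows.setRowSchema(ROW_SCHEMA);
          }
        }

    Note that readUsingDirectReadMethodPushDown below passes, apparently
    because the push-down plan (BeamPushDownIOSourceRel) attaches its output
    schema while the plain BeamIOSourceRel path in this build does not.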

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
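
    For context, the BEAMPlan above shows both push-downs taking effect:
    usedFields trims the projection and BigQueryFilter moves the predicate
    into the source, so no BeamCalcRel remains after the read. A rough,
    hypothetical sketch of the equivalent hand-written BigQueryIO read
    follows; the table reference is a placeholder, and only the method,
    field list, and row restriction mirror this log:

        import com.google.api.services.bigquery.model.TableRow;
        import java.util.Arrays;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

        class PushDownEquivalent {
          static TypedRead<TableRow> read() {
            return BigQueryIO.readTableRows()
                // Placeholder table reference, not the one this IT reads.
                .from("bigquery-public-data:hacker_news.full")
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Projection push-down: only these columns leave BigQuery.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: the filter is evaluated server-side.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
          }
        }

    Because the filtering and projection happen before any Beam transform
    runs, this is also the only one of the three tests that gets far enough
    to execute on Dataflow.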
    Sep 21, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 21, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9199631328302929270.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qtJQ1nU_LVwc-bMkv5pcCvxZv0ZAzYRLUG99Adr53Bo.jar
    Sep 21, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 21, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash e34f7296faccfaf7745513c9c8f6c117f387ad5a10c8133a3c858ea05cb84898> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-409ylvrM-vd0VRPJyPbBF_OHrVoQyBM6PIWOoFy4SJg.pb
    Sep 21, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 21, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 21, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 21, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-21_05_45_09-4200941418815551979?project=apache-beam-testing
    Sep 21, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-21_05_45_09-4200941418815551979
    Sep 21, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-21_05_45_09-4200941418815551979
    Sep 21, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-21T12:45:12.560Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:45:18.858Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:45:19.664Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:45:19.698Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:45:19.734Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:45:19.830Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:45:19.869Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:45:19.903Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:45:20.279Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 21, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:45:20.361Z: Starting 5 workers in us-central1-c...
    Sep 21, 2021 12:45:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:45:33.390Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:46:07.964Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:46:35.443Z: Workers have started successfully.
    Sep 21, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:46:35.474Z: Workers have started successfully.
    Sep 21, 2021 12:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:47:06.566Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 21, 2021 12:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:47:06.787Z: Cleaning up.
    Sep 21, 2021 12:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:47:06.917Z: Stopping worker pool...
    Sep 21, 2021 12:50:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:50:05.289Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2021 12:50:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T12:50:05.330Z: Worker pool stopped.
    Sep 21, 2021 12:50:11 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-21_05_45_09-4200941418815551979 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 93904afc-5048-4ad6-8983-056c3298e888 and timestamp: 2021-09-21T12:50:11.911000000Z:
                     Metric:                    Value:
                   read_time                     9.809
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2021 12:50:12 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 5 mins 21.387 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 53s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/mt2hzwi53hric

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2451

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2451/display/redirect?page=changes>

Changes:

[chamikaramj] Python support for directly using Java transforms using constructor and

[chamikaramj] Fixes yapf

[chamikaramj] Fixes lint

[chamikaramj] Addressing reviewer comments

[chamikaramj] Adds support for a field name format that will be ignored at expansion

[chamikaramj] Addresses reviewer comments

[chamikaramj] Use correct ignore field prefix in Python side


------------------------------------------
[...truncated 339.99 KB...]
> Task :sdks:java:extensions:sql:perf-tests:testClasses
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) started.
Gradle Test Executor 9 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is e50055417125b31f26e1108fef7d1958
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 9'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 9'
Successfully started process 'Gradle Test Executor 9'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 21, 2021 6:45:57 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 21, 2021 6:45:58 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 21, 2021 6:45:58 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 21, 2021 6:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 6:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@784892670]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1151017424]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 6:46:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 6:46:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2021 6:46:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 21, 2021 6:46:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2021 6:46:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2021 6:46:07 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 21, 2021 6:46:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8512241474882420204.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-N3T6yMXWd4vU9P2BPRbMJm0QgxBhdlJ15cxIuaPmGqQ.jar
    Sep 21, 2021 6:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 21, 2021 6:46:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2021 6:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash c124355bc851def2d9176381e34ffe6d6e29ff51e60486af60a828cc892ea38d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wSQ1W8hR3vLZF2OB40_-bW4p_1HmBIavYKgozIkuo40.pb
    Sep 21, 2021 6:46:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2021 6:46:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 21, 2021 6:46:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 21, 2021 6:46:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 21, 2021 6:46:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-20_23_46_10-10554845187517000315?project=apache-beam-testing
    Sep 21, 2021 6:46:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-20_23_46_10-10554845187517000315
    Sep 21, 2021 6:46:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-20_23_46_10-10554845187517000315
    Sep 21, 2021 6:46:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-21T06:46:16.062Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:46:21.626Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:46:22.494Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:46:22.531Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:46:22.558Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:46:22.643Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:46:22.691Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:46:22.714Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:46:23.040Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:46:23.102Z: Starting 5 workers in us-central1-a...
    Sep 21, 2021 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:46:42.938Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2021 6:47:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:47:06.667Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2021 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:47:39.878Z: Workers have started successfully.
    Sep 21, 2021 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:47:39.913Z: Workers have started successfully.
    Sep 21, 2021 6:48:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:48:06.920Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 21, 2021 6:48:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:48:07.036Z: Cleaning up.
    Sep 21, 2021 6:48:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:48:07.099Z: Stopping worker pool...
    Sep 21, 2021 6:50:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:50:23.433Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2021 6:50:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T06:50:23.474Z: Worker pool stopped.
    Sep 21, 2021 6:50:28 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-20_23_46_10-10554845187517000315 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1e37e075-1024-4a77-ab3e-6166dc87494d and timestamp: 2021-09-21T06:50:28.994000000Z:
                     Metric:                    Value:
                   read_time                     8.339
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2021 6:50:29 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 9 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.005 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.003 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 35.969 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 8s
152 actionable tasks: 99 executed, 53 from cache

Publishing build scan...
https://gradle.com/s/ffn5mowaadkhq

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2450

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2450/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Additional CoGBK tests.

[noreply] [BEAM-12803] Update deprecated use of _field_types (#15539)

[Robert Bradshaw] Move CoGBK tests into appropreate module.


------------------------------------------
[...truncated 338.38 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 4dccc9788914c2d27a695eb8c5d03105
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 21, 2021 12:44:51 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 21, 2021 12:44:52 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 21, 2021 12:44:52 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 21, 2021 12:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 12:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
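
The exception names its own remedy: the Row PCollection produced under BeamIOSourceRel carries no schema, so no RowCoder can be inferred for it. A minimal sketch of the pattern the message points at, with an illustrative schema and a Create source standing in for the test's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema mirroring the fields the failing query selects.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        Row row = Row.withSchema(schema)
            .addValues("someone", "story", "Example title", 3L)
            .build();

        // Attaching the schema lets the SDK derive a RowCoder; without it,
        // coder inference fails exactly as in the stack trace above.
        PCollection<Row> rows = p.apply(Create.of(row).withRowSchema(schema));

        // For a Row PCollection produced by another transform (as with the
        // ParDo(RowMonitor) output above), the equivalent fixes are:
        //   rows.setRowSchema(schema);
        // or, explicitly:
        //   rows.setCoder(RowCoder.of(schema));

        p.run().waitUntilFinish();
      }
    }

The same root cause repeats in the readUsingDefaultMethod failure just below; of the three tests in this run, only the push-down variant completes.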

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 21, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 21, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 21, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 21, 2021 12:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 21, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 21, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 21, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6331853008731212232.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YC70iJPj_Rdvnh6iM_nUeFmoKYqMzWSWPerLpAZMR9M.jar
    Sep 21, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 21, 2021 12:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 21, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 43788cc3b0ed4102ad1b13e5829121cdc205e16015bd5c53b166657a9f7eef9d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Q3iMw7DtQQKtGxPlgpEhzcIF4WAVvVxTsWZlep9-750.pb
    Sep 21, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 21, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 21, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 21, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 21, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-20_17_45_04-15111700924769977442?project=apache-beam-testing
    Sep 21, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-20_17_45_04-15111700924769977442
    Sep 21, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-20_17_45_04-15111700924769977442
    Sep 21, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-21T00:45:07.930Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 21, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:13.616Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 21, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:14.252Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 21, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:14.301Z: Expanding GroupByKey operations into optimizable parts.
    Sep 21, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:14.327Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 21, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:14.398Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 21, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:14.437Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 21, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:14.469Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 21, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:14.790Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 21, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:14.865Z: Starting 5 workers in us-central1-a...
    Sep 21, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:45.174Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2021 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:56.797Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 21, 2021 12:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:45:56.814Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 21, 2021 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:46:07.027Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 21, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:46:32.670Z: Workers have started successfully.
    Sep 21, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:46:32.698Z: Workers have started successfully.
    Sep 21, 2021 12:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:47:01.532Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 21, 2021 12:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:47:01.667Z: Cleaning up.
    Sep 21, 2021 12:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:47:01.732Z: Stopping worker pool...
    Sep 21, 2021 12:49:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:49:26.143Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 21, 2021 12:49:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-21T00:49:26.185Z: Worker pool stopped.
    Sep 21, 2021 12:49:33 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-20_17_45_04-15111700924769977442 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d992d211-7649-465a-8390-80dd62e59b54 and timestamp: 2021-09-21T00:49:33.364000000Z:
                     Metric:                    Value:
                   read_time                     8.605
                 fields_read                 4375276.0
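
For context on the DIRECT_READ and DEFAULT markers recurring through these logs: they are BigQueryIO's two read paths. DEFAULT reads via a BigQuery export job, while DIRECT_READ uses the BigQuery Storage API, which is what lets the planner hand the projection (usedFields) and the filter to the service in the BeamPushDownIOSourceRel plan above. A hedged sketch of that distinction at the BigQueryIO level, with an illustrative table name rather than this suite's configuration:

    import java.util.Arrays;

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    public class DirectReadSketch {
      public static void main(String[] args) {
        // A real run would also need GCP options (--project, --tempLocation);
        // this only illustrates the API shape.
        Pipeline p = Pipeline.create();

        p.apply(
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.some_table") // illustrative name
                .withMethod(TypedRead.Method.DIRECT_READ)
                // With DIRECT_READ, projection and filtering can be handed to the
                // Storage API, as in the "Pushing down the following filter" line.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        // TypedRead.Method.DEFAULT (effectively EXPORT for a plain table read)
        // goes through a BigQuery export job instead, with no push-down.
        p.run().waitUntilFinish();
      }
    }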

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 21, 2021 12:49:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
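
A hedged note on this last warning: the InfluxDB publisher skips publishing when it is not given a measurement and database. Assuming the option names Beam's test utilities use for this elsewhere (an assumption here, since this job's configuration is not shown), the cure would be extra entries in the -DbeamTestPipelineOptions array, along the lines of:

    "--influxDatabase=beam_test_metrics",
    "--influxMeasurement=sql_bqio_read_java_batch",
    "--influxHost=http://localhost:8086"

With those absent the warning is benign for the test result itself; only the InfluxDB publication is skipped.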

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 46.265 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 16s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/qhqeuealhlp4y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2449

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2449/display/redirect?page=changes>

Changes:

[danthev] Fix flaky test.

[danthev] Fix lint errors.


------------------------------------------
[...truncated 341.10 KB...]
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 4dccc9788914c2d27a695eb8c5d03105
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 20, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 20, 2021 6:45:12 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 20, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 20, 2021 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@784892670]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1151017424]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 20, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 20, 2021 6:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 20, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6833087523479725486.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-T_fyMn4O_viNZtHxu6t49Ixm9n1a9rctQNT_ylYHCRQ.jar
    Sep 20, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 20, 2021 6:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 1e2f0f48131d49f93a6232d0f4591b2f9516fe4f6ade6fd0c0374df1eff8dd41> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Hi8PSBMdSfk6YjLQ9FkbL5UW_k9q3m_QwDdN8e_43UE.pb
    Sep 20, 2021 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2021 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 20, 2021 6:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 20, 2021 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 20, 2021 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-20_11_45_25-14236663929264241551?project=apache-beam-testing
    Sep 20, 2021 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-20_11_45_25-14236663929264241551
    Sep 20, 2021 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-20_11_45_25-14236663929264241551
    Sep 20, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-20T18:45:29.646Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2021 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:45:36.337Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:45:37.252Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:45:37.291Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:45:37.323Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:45:37.397Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:45:37.463Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:45:37.500Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:45:37.923Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:45:38.017Z: Starting 5 workers in us-central1-c...
    Sep 20, 2021 6:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:46:08.754Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2021 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:46:11.080Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 20, 2021 6:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:46:11.112Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 20, 2021 6:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:46:21.360Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2021 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:46:46.062Z: Workers have started successfully.
    Sep 20, 2021 6:46:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:46:46.098Z: Workers have started successfully.
    Sep 20, 2021 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:47:17.485Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 20, 2021 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:47:17.636Z: Cleaning up.
    Sep 20, 2021 6:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:47:17.735Z: Stopping worker pool...
    Sep 20, 2021 6:49:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:49:41.896Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2021 6:49:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T18:49:41.944Z: Worker pool stopped.
    Sep 20, 2021 6:49:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-20_11_45_25-14236663929264241551 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c2953af7-9466-40b3-a259-ecb9c63bd96e and timestamp: 2021-09-20T18:49:48.800000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.797

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2021 6:49:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 41.483 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 31s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/almlfcm2sl7aw

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2448

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2448/display/redirect>

Changes:


------------------------------------------
[...truncated 338.15 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2e2b2489cf9d882242349157c9769a31
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 20, 2021 12:44:51 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 20, 2021 12:44:52 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 20, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 20, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 20, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 20, 2021 12:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT-h5Vm59YQQa-RcmHCCG09Iw6skDekxZqFrPsGtnvUHB4.jar
    Sep 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7769499112067325841.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qu_X7N3QbjtJXZki0B2PaWP8mpBRnGZrS80fVAajjhY.jar
    Sep 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT-tests-Lg8Zg7us2s6VPtPB7f5WrJKFl5dhDd82wMfxmGkALkE.jar
    Sep 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 3 files newly uploaded in 0 seconds
    Sep 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 8a69b64dcbf8da07a1892edcd7d5c82dce72df1e1fad1e7147d2528951f5a94b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-imm2Tcv42gehiS7c19XILc5y3x4frR5xR9JSiVH1qUs.pb
    Sep 20, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 20, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 20, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 20, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-20_05_45_05-6250377755695887449?project=apache-beam-testing
    Sep 20, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-20_05_45_05-6250377755695887449
    Sep 20, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-20_05_45_05-6250377755695887449
    Sep 20, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-20T12:45:10.026Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2021 12:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:45:15.234Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 20, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:45:16.153Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:45:16.185Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:45:16.219Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:45:16.296Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:45:16.324Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:45:16.347Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 20, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:45:16.682Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 20, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:45:16.761Z: Starting 5 workers in us-central1-a...
    Sep 20, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:45:50.100Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:46:05.463Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:46:31.254Z: Workers have started successfully.
    Sep 20, 2021 12:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:46:31.284Z: Workers have started successfully.
    Sep 20, 2021 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:46:58.896Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 20, 2021 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:46:59.032Z: Cleaning up.
    Sep 20, 2021 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:46:59.103Z: Stopping worker pool...
    Sep 20, 2021 12:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:49:16.508Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2021 12:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T12:49:16.555Z: Worker pool stopped.
    Sep 20, 2021 12:49:21 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-20_05_45_05-6250377755695887449 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 88308b99-e8a2-4d5c-887d-020e57452cfa and timestamp: 2021-09-20T12:49:22.035000000Z:
                     Metric:                    Value:
                   read_time                     8.095
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2021 12:49:22 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
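
The warning above means the run finished but its metrics were dropped: the InfluxDB publisher requires both a measurement and a database before it will write anything. A sketch of supplying them through the test utilities' settings builder; treat the builder method names as assumptions if your Beam version differs, and the host/database values as placeholders:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")   // placeholder host
            .withDatabase("beam_test_metrics")   // placeholder database
            .withMeasurement("sql_bqio_read_java_batch")
            .get();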

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 34.464 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 5s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/m5j2k4dd623jc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2447

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2447/display/redirect>

Changes:


------------------------------------------
[...truncated 340.32 KB...]
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
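
The -DbeamTestPipelineOptions JSON array in the command line above is how the Gradle task hands pipeline options to the test JVM; on the test side, Beam's TestPipeline parses that system property back into PipelineOptions. A minimal sketch:

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.testing.TestPipeline;

    // Parses the -DbeamTestPipelineOptions system property set above.
    PipelineOptions options = TestPipeline.testingPipelineOptions();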

Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 20, 2021 6:46:18 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
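
The deprecation warning above has a one-line fix on the options side: pass --sdkContainerImage instead of --workerHarnessContainerImage. A sketch, with the image value as a placeholder assumption:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(
                "--runner=DataflowRunner",
                "--sdkContainerImage=gcr.io/my-project/beam-java-sdk")  // placeholder
            .as(DataflowPipelineOptions.class);
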
    Sep 20, 2021 6:46:21 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 20, 2021 6:46:22 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 20, 2021 6:46:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 6:46:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:46:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 6:46:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 6:46:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:46:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 6:46:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1657634523]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 6:46:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@53688365]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2021 6:46:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 6:46:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:46:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 6:46:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 6:46:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 6:46:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 6:46:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 20, 2021 6:46:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 20, 2021 6:46:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2021 6:46:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2021 6:46:42 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 20, 2021 6:46:43 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7183188378273839349.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-mgZjTt6naUChfRxKldtKZjTKUPteKwyKmF9nMkyzV5s.jar
    Sep 20, 2021 6:46:46 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 4 seconds
    Sep 20, 2021 6:46:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2021 6:46:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash cc41988f5af6504fd7e3beb3570ca172c51ab19d5eb56a7850817ae10f0f9f93> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zEGYj1r2UE_X476zVwyhcsUasZ1etWp4UIF64Q8Pn5M.pb
    Sep 20, 2021 6:46:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2021 6:46:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 20, 2021 6:46:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 20, 2021 6:46:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 20, 2021 6:46:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-19_23_46_51-13258525420254987853?project=apache-beam-testing
    Sep 20, 2021 6:46:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-19_23_46_51-13258525420254987853
    Sep 20, 2021 6:46:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-19_23_46_51-13258525420254987853
    Sep 20, 2021 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-20T06:46:55.234Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:02.911Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 20, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:03.558Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:03.589Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:03.618Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:03.693Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:03.728Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:03.761Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 20, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:04.160Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 20, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:04.239Z: Starting 5 workers in us-central1-c...
    Sep 20, 2021 6:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:33.400Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2021 6:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:33.643Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 20, 2021 6:47:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:33.661Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 20, 2021 6:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:47:43.939Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2021 6:48:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:48:09.297Z: Workers have started successfully.
    Sep 20, 2021 6:48:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:48:09.320Z: Workers have started successfully.
    Sep 20, 2021 6:48:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:48:37.512Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 20, 2021 6:48:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:48:37.640Z: Cleaning up.
    Sep 20, 2021 6:48:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:48:37.719Z: Stopping worker pool...
    Sep 20, 2021 6:50:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:50:55.307Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2021 6:50:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T06:50:55.353Z: Worker pool stopped.
    Sep 20, 2021 6:51:02 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-19_23_46_51-13258525420254987853 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f3ce399f-6652-4a1b-84d4-dbe9723458ab and timestamp: 2021-09-20T06:51:02.597000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     6.959

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2021 6:51:03 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 55.124 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 28s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/kvqadneeh5zxe

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Mon Sep 13 06:44:48 UTC 2021.
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.209 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2446

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2446/display/redirect>

Changes:


------------------------------------------
[...truncated 337.33 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2e2b2489cf9d882242349157c9769a31
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 20, 2021 12:44:51 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 20, 2021 12:44:51 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 20, 2021 12:44:52 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 20, 2021 12:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 12:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@784892670]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1151017424]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 20, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 20, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 20, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 20, 2021 12:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 20, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 20, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 20, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4426132592971236140.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-QoyyI3kA-wH4jxVC-pHdnJGnQBfa-IabpqcP35qDWpw.jar
    Sep 20, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 20, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 20, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 79e55825c33f8a835ffa6548fb737e7e22279d368b1a178a8d89e1cbe4607272> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-eeVYJcM_ioNf-mVI-3N-fiInnTaLGheKjYnhy-RgcnI.pb
    Sep 20, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 20, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 20, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 20, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 20, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-19_17_45_04-16828390132500595680?project=apache-beam-testing
    Sep 20, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-19_17_45_04-16828390132500595680
    Sep 20, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-19_17_45_04-16828390132500595680
    Sep 20, 2021 12:45:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-20T00:45:08.100Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 20, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:15.297Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 20, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:16.133Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 20, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:16.173Z: Expanding GroupByKey operations into optimizable parts.
    Sep 20, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:16.209Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 20, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:16.270Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 20, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:16.295Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 20, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:16.319Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 20, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:16.599Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 20, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:16.675Z: Starting 5 workers in us-central1-c...
    Sep 20, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:32.094Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 20, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:47.154Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 20, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:47.176Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 20, 2021 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:45:57.473Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 20, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:46:24.753Z: Workers have started successfully.
    Sep 20, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:46:24.793Z: Workers have started successfully.
    Sep 20, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:46:57.674Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 20, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:46:57.839Z: Cleaning up.
    Sep 20, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:46:57.910Z: Stopping worker pool...
    Sep 20, 2021 12:49:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:49:12.687Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 20, 2021 12:49:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-20T00:49:12.730Z: Worker pool stopped.
    Sep 20, 2021 12:49:21 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-19_17_45_04-16828390132500595680 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): db669282-1e3e-44f4-8639-0640c7fa31b8 and timestamp: 2021-09-20T00:49:21.160000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.535

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 20, 2021 12:49:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 34.083 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 4s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/o6ouf5zpkyv3q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2445

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2445/display/redirect>

Changes:


------------------------------------------
[...truncated 337.53 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2e2b2489cf9d882242349157c9769a31
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
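
The -DbeamTestPipelineOptions JSON in the command line above is how these flags reach the test JVM. A minimal sketch, assuming only the standard PipelineOptionsFactory entry point, of how Beam turns such flags into typed options (the flag subset is copied from the command line above; resolving --runner=DataflowRunner requires beam-runners-google-cloud-dataflow-java on the classpath, as in the Gradle invocation):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ParseOptionsSketch {
      public static void main(String[] args) {
        // Subset of the flags passed via -DbeamTestPipelineOptions above.
        String[] flags = {
            "--project=apache-beam-testing",
            "--tempLocation=gs://temp-storage-for-perf-tests/loadtests",
            "--runner=DataflowRunner",
            "--numWorkers=5",
            "--autoscalingAlgorithm=NONE",
            "--region=us-central1",
        };
        // PipelineOptionsFactory parses the flags into a typed options object;
        // TestPipeline does the same with the beamTestPipelineOptions property.
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(flags).as(DataflowPipelineOptions.class);
        System.out.println(options.getProject() + ", workers=" + options.getNumWorkers());
      }
    }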

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 19, 2021 6:44:53 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 19, 2021 6:44:54 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 19, 2021 6:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 19, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
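
The exception above names its own fix: attach a schema to the Row PCollection so a coder can be inferred. A minimal sketch of that fix, assuming hypothetical field names (the real test would derive them from the HACKER_NEWS table, which is not reproduced here):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFixSketch {
      // Attach a schema so a Row coder can be inferred; per the message above,
      // setRowSchema makes the collection use a schema-based Row coder.
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        Schema schema = Schema.builder()
            .addStringField("author")  // assumed field names: the query's
            .addStringField("type")    // projected columns, not verified here
            .addStringField("title")
            .addInt32Field("score")
            .build();
        return rows.setRowSchema(schema);
      }
    }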

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 19, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
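
For reference, this is the shape of the same query when issued through the public SqlTransform API rather than the test's BeamSqlRelUtils path; a minimal sketch, with the table registration (BigQuery table provider / CREATE EXTERNAL TABLE setup) omitted and assumed to be in place:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // Same query as in the log. With the table registered and
        // method=DIRECT_READ, the planner produces the BeamPushDownIOSourceRel
        // above: only [by, type, title, score] are read, and the filter is
        // evaluated inside BigQuery rather than on Dataflow workers.
        PCollection<Row> rows =
            p.apply(SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM HACKER_NEWS "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }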
    Sep 19, 2021 6:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 19, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1653160675735518796.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-iEaPyr1lnZzA_UmSRUXQmZqPy8kQiRwdjh7sc0PoCU0.jar
    Sep 19, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 19, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash 8d6e01620c12c377ee6239240cec7d82c62380ebf854e8819204ec5eb64893fa> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jW4BYgwSw3fuYjkkDOx9gsYjgOv4VOiBkgTsXrZIk_o.pb
    Sep 19, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 19, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 19, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 19, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-19_11_45_07-12541032630070109918?project=apache-beam-testing
    Sep 19, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-19_11_45_07-12541032630070109918
    Sep 19, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-19_11_45_07-12541032630070109918
    Sep 19, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-19T18:45:10.752Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:16.403Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 19, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:17.162Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:17.203Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:17.241Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:17.317Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:17.352Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:17.387Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 19, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:17.741Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 19, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:17.820Z: Starting 5 workers in us-central1-a...
    Sep 19, 2021 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:39.325Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2021 6:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:45:56.154Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2021 6:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:46:22.128Z: Workers have started successfully.
    Sep 19, 2021 6:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:46:22.156Z: Workers have started successfully.
    Sep 19, 2021 6:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:46:49.822Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 19, 2021 6:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:46:49.927Z: Cleaning up.
    Sep 19, 2021 6:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:46:50.010Z: Stopping worker pool...
    Sep 19, 2021 6:49:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:49:06.639Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2021 6:49:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T18:49:06.690Z: Worker pool stopped.
    Sep 19, 2021 6:49:12 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-19_11_45_07-12541032630070109918 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 18942ca1-42c4-41c2-91fe-a5601972494c and timestamp: 2021-09-19T18:49:12.536000000Z:
                     Metric:                    Value:
                   read_time                     6.595
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2021 6:49:13 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 23.177 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 53s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/2yqb7d7vumxq4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2444

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2444/display/redirect>

Changes:


------------------------------------------
[...truncated 337.50 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2e2b2489cf9d882242349157c9769a31
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 19, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 19, 2021 12:44:54 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 19, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 19, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@784892670]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 19, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1151017424]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 19, 2021 12:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 19, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2385287251749945531.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-biuKFtfxebcbszDzssfOEHX30Yc7RM-ZnvJd_N3pXX0.jar
    Sep 19, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 19, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash ec3685dd45af0a8234e46f2df0a7b05dd59e901163188197770e944b89a26c89> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7DaF3UWvCoI05G8t8KewXdWekBFjGIGXdw6US4mibIk.pb
    Sep 19, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 19, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 19, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 19, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-19_05_45_07-15457158388052488107?project=apache-beam-testing
    Sep 19, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-19_05_45_07-15457158388052488107
    Sep 19, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-19_05_45_07-15457158388052488107
    Sep 19, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-19T12:45:10.648Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:45:16.699Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 19, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:45:17.465Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:45:17.496Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:45:17.522Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:45:17.594Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:45:17.622Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:45:17.655Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 19, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:45:17.987Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 19, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:45:18.057Z: Starting 5 workers in us-central1-a...
    Sep 19, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:45:22.370Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:46:05.281Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:46:35.326Z: Workers have started successfully.
    Sep 19, 2021 12:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:46:35.354Z: Workers have started successfully.
    Sep 19, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:47:02.720Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 19, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:47:02.859Z: Cleaning up.
    Sep 19, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:47:02.920Z: Stopping worker pool...
    Sep 19, 2021 12:49:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:49:30.212Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2021 12:49:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T12:49:30.248Z: Worker pool stopped.
    Sep 19, 2021 12:49:35 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-19_05_45_07-15457158388052488107 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 946bdbb9-ad1a-4629-82a7-6aa6743acae4 and timestamp: 2021-09-19T12:49:35.342000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.918

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2021 12:49:35 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 46.36 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 17s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4kjmt3h3hs2n4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2443

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2443/display/redirect>

Changes:


------------------------------------------
[...truncated 341.79 KB...]
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 19, 2021 6:45:52 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 19, 2021 6:45:54 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 19, 2021 6:45:54 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 19, 2021 6:45:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 6:45:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:46:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 6:46:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 6:46:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:46:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 6:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@127179475]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 19, 2021 6:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2021 6:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 6:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2021 6:46:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:46:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@868637325]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2021 6:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
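
Push-down changes the read itself: the plan above keeps no BeamCalcRel, because
only the used fields are requested and the filter is evaluated by the BigQuery
Storage API. A sketch of the equivalent hand-written read against the public
BigQueryIO API (the SQL layer's BigQueryTable does this wiring internally; the
public Hacker News dataset below is an assumed stand-in for the test's table,
and running it requires GCP credentials):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Field projection: only the columns the query uses.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Filter evaluated server-side in the read session.
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }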
    Sep 19, 2021 6:46:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2021 6:46:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2021 6:46:10 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 19, 2021 6:46:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5363249692408335901.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-y-B-2DhdolkcxTsYTbZYx3rzcYpbpnp9bQwyiRUcOKM.jar
    Sep 19, 2021 6:46:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 19, 2021 6:46:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2021 6:46:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 978c84376bd852fc1fde085c4f5a152de5fe925360e243553909af8e8a35a2fa> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-l4yEN2vYUvwf3ghcT1oVLeX-klNg4kNVOQmvjoo1ovo.pb
    Sep 19, 2021 6:46:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2021 6:46:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 19, 2021 6:46:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 19, 2021 6:46:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 19, 2021 6:46:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-18_23_46_14-17703222293438978451?project=apache-beam-testing
    Sep 19, 2021 6:46:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-18_23_46_14-17703222293438978451
    Sep 19, 2021 6:46:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-18_23_46_14-17703222293438978451
    Sep 19, 2021 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-19T06:46:18.008Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2021 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:46:27.446Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 19, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:46:28.273Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:46:28.321Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:46:28.347Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:46:28.422Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:46:28.455Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:46:28.483Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 19, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:46:28.799Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 19, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:46:28.889Z: Starting 5 workers in us-central1-c...
    Sep 19, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:46:58.065Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2021 6:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:47:07.163Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 19, 2021 6:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:47:07.190Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 19, 2021 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:47:17.457Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2021 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:47:42.226Z: Workers have started successfully.
    Sep 19, 2021 6:47:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:47:42.260Z: Workers have started successfully.
    Sep 19, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:48:10.097Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 19, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:48:10.231Z: Cleaning up.
    Sep 19, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:48:10.310Z: Stopping worker pool...
    Sep 19, 2021 6:50:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:50:36.446Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2021 6:50:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T06:50:36.489Z: Worker pool stopped.
    Sep 19, 2021 6:50:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-18_23_46_14-17703222293438978451 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 70f996ce-5135-4209-aadf-ae60dd06f90f and timestamp: 2021-09-19T06:50:42.162000000Z:
                     Metric:                    Value:
                   read_time                     7.234
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2021 6:50:42 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
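
The warning means the metrics shown above were computed but not exported. In
Beam's test utilities the exporter is driven by an InfluxDBSettings value; the
sketch below shows the expected shape of that configuration, under the
assumption that the builder matches other Beam load tests, and with host,
database, and measurement values invented for illustration:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        // All three values are hypothetical. When database or measurement is
        // missing, InfluxDBPublisher logs the warning above and skips publishing.
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")
                .withDatabase("beam_test_metrics")
                .withMeasurement("sql_bqio_read_java_batch")
                .get();
        // The settings object would then be handed to the test's publisher.
      }
    }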

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 57.09 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/peudu53fkfzye

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Sun Sep 12 06:44:40 UTC 2021.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.83 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2442

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2442/display/redirect>

Changes:


------------------------------------------
[...truncated 339.45 KB...]
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
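
The key part of the command above is -DbeamTestPipelineOptions: the test JVM
receives the runner, project, and worker configuration as a JSON system
property, which TestPipeline parses when the test builds its pipeline. A
minimal sketch of a test wired the standard way (the test body is
illustrative, not taken from BigQueryIOPushDownIT):

    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.junit.Rule;
    import org.junit.Test;

    public class PipelineOptionsWiringSketch {
      // TestPipeline.create() reads the beamTestPipelineOptions system property
      // set on the Gradle test JVM, so runner and project need no code changes.
      @Rule public final transient TestPipeline pipeline = TestPipeline.create();

      @Test
      public void runsOnTheConfiguredRunner() {
        pipeline.apply(Create.of(1, 2, 3));
        pipeline.run().waitUntilFinish();
      }
    }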

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 19, 2021 12:44:52 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 19, 2021 12:44:52 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 19, 2021 12:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 19, 2021 12:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 12:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@784892670]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1151017424]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 19, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 19, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 19, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 19, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 19, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 19, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 19, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test659937586620404907.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SR_QdYYePgNPgrgs2medifMEAsLCBllgkgYMm5kJfGE.jar
    Sep 19, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Sep 19, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Sep 19, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Sep 19, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Sep 19, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 5 files newly uploaded in 0 seconds
    Sep 19, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 19, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash 05ff7e70ecae048a27522839ee324f50718aa4f7a7a493521b27da3ef5c1505e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Bf9-cOyuBIonUig57jJPUHGKpPenpJNSGyfaPvXBUF4.pb
    Sep 19, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 19, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 19, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 19, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 19, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-18_17_45_05-4344567043694350364?project=apache-beam-testing
    Sep 19, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-18_17_45_05-4344567043694350364
    Sep 19, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-18_17_45_05-4344567043694350364
    Sep 19, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-19T00:45:08.994Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 19, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:16.034Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 19, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:16.877Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 19, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:16.910Z: Expanding GroupByKey operations into optimizable parts.
    Sep 19, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:16.934Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 19, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:16.995Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 19, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:17.032Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 19, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:17.072Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 19, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:17.440Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 19, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:17.521Z: Starting 5 workers in us-central1-c...
    Sep 19, 2021 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:37.426Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 19, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:50.546Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 19, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:45:50.578Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 19, 2021 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:46:00.851Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 19, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:46:26.464Z: Workers have started successfully.
    Sep 19, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:46:26.520Z: Workers have started successfully.
    Sep 19, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:46:58.172Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 19, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:46:58.534Z: Cleaning up.
    Sep 19, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:46:58.630Z: Stopping worker pool...
    Sep 19, 2021 12:49:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:49:14.155Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 19, 2021 12:49:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-19T00:49:14.228Z: Worker pool stopped.
    Sep 19, 2021 12:49:54 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-18_17_45_05-4344567043694350364 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0b513e5e-647a-4717-a010-2f08d542684d and timestamp: 2021-09-19T00:49:54.400000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.793

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 19, 2021 12:49:54 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 5 mins 6.486 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 37s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/bxeybgq7gulgm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2441

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2441/display/redirect>

Changes:


------------------------------------------
[...truncated 338.26 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2e2b2489cf9d882242349157c9769a31
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 18, 2021 6:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 18, 2021 6:45:01 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 18, 2021 6:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 18, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@784892670]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1151017424]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 18, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 18, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5979530977339199524.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-UYkmvEai9tqk1NB18q5tE65KTYKyeQG_fPS1avJAKWo.jar
    Sep 18, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 18, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash ff9277464ca049e2305ccd0594126e954b43700e85cd983c4e244cbb4fe3ebc2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_5J3RkygSeIwXM0FlBJulUtDcA6FzZg8TiRMu0_j68I.pb
    Sep 18, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 18, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 18, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 18, 2021 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-18_11_45_15-16673422427586790898?project=apache-beam-testing
    Sep 18, 2021 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-18_11_45_15-16673422427586790898
    Sep 18, 2021 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-18_11_45_15-16673422427586790898
    Sep 18, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-18T18:45:19.387Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:45:25.862Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 18, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:45:26.788Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:45:26.822Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:45:26.845Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:45:26.909Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:45:26.940Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:45:26.974Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 18, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:45:27.319Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 18, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:45:27.388Z: Starting 5 workers in us-central1-c...
    Sep 18, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:45:44.268Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2021 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:46:11.422Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:46:37.715Z: Workers have started successfully.
    Sep 18, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:46:37.767Z: Workers have started successfully.
    Sep 18, 2021 6:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:47:07.698Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 18, 2021 6:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:47:07.845Z: Cleaning up.
    Sep 18, 2021 6:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:47:07.934Z: Stopping worker pool...
    Sep 18, 2021 6:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:49:24.022Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2021 6:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T18:49:24.058Z: Worker pool stopped.
    Sep 18, 2021 6:49:30 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-18_11_45_15-16673422427586790898 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2df1b947-7957-404a-a6bb-ecfb2f4fe2a4 and timestamp: 2021-09-18T18:49:30.288000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.433

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2021 6:49:31 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
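
The publisher validates its settings before writing anything, so with no database and measurement configured the run's metrics are simply dropped rather than published. A minimal sketch of supplying them, assuming the InfluxDBSettings builder from Beam's test-utils publishing package (all values below are placeholders, not the perf-test job's real configuration):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    class InfluxSettingsSketch {
      // Placeholder values; a real run would take these from the job's own
      // configuration. Without database and measurement the publisher skips
      // publishing, as the warning above says.
      static InfluxDBSettings settings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();
      }
    }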

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.038 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 35.478 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 14s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4sgrafoyp3qho

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2440

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2440/display/redirect>

Changes:


------------------------------------------
[...truncated 338.63 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2e2b2489cf9d882242349157c9769a31
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 18, 2021 12:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 18, 2021 12:45:00 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 18, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 18, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@784892670]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
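
The exception text enumerates the remedies itself: set a Coder explicitly, or attach a schema so a row coder can be inferred. A minimal sketch of the setRowSchema route the message points at (field names and types here are assumed from the SELECT list for illustration; the real HACKER_NEWS table has more columns):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFixSketch {
      // Attaching an explicit schema lets the SDK infer a RowCoder, which is
      // the "PCollection.setRowSchema" remedy named in the exception message.
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author") // fields assumed from the SELECT list
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        return rows.setRowSchema(schema);
      }
    }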

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1151017424]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 18, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 18, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test508090196888883738.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-j8lI1z78mx1qReg8w20b9ImbvZUa94tSU87P0rkRTXs.jar
    Sep 18, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 file newly uploaded in 0 seconds
    Sep 18, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104004 bytes, hash c0595ef0cd231df1819f851e89b0e4f83ebe911b6543e7b429798027b86e9438> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wFle8M0jHfGBn4UeibDk-D6-kRtlQ-e0KXmAJ7hulDg.pb
    Sep 18, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 18, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 18, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 18, 2021 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-18_05_45_14-6325844803791487056?project=apache-beam-testing
    Sep 18, 2021 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-18_05_45_14-6325844803791487056
    Sep 18, 2021 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-18_05_45_14-6325844803791487056
    Sep 18, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-18T12:45:17.859Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:45:24.925Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:45:25.865Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:45:25.918Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:45:25.953Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:45:26.024Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:45:26.060Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:45:26.094Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:45:26.446Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:45:26.508Z: Starting 5 workers in us-central1-c...
    Sep 18, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:45:39.478Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:46:13.666Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2021 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:46:39.620Z: Workers have started successfully.
    Sep 18, 2021 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:46:39.644Z: Workers have started successfully.
    Sep 18, 2021 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:47:07.254Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 18, 2021 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:47:07.384Z: Cleaning up.
    Sep 18, 2021 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:47:07.463Z: Stopping worker pool...
    Sep 18, 2021 12:49:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:49:41.585Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2021 12:49:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T12:49:41.626Z: Worker pool stopped.
    Sep 18, 2021 12:49:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-18_05_45_14-6325844803791487056 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 058dd872-bbd4-4445-b23f-91500d0acd92 and timestamp: 2021-09-18T12:49:47.520000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.455

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2021 12:49:47 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 52.004 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 27s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/wrv4jwmeuvrfw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2439

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2439/display/redirect>

Changes:


------------------------------------------
[...truncated 339.46 KB...]
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2e2b2489cf9d882242349157c9769a31
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 18, 2021 6:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 18, 2021 6:44:59 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 18, 2021 6:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 18, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 18, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 18, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4040703456648660585.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2ZQ1RyJ5WY-qkJE4NhYl5MZjBgQtRsIm40FC8zwqkmM.jar
    Sep 18, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.69/6cb063f1d527ff65bdbb9ea74888a5ffc3f92197/fastjson-1.2.69.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.69-KniRdEoDeOAJmpuB3b7U0uGn6GMvb4b4_8-jg-UeScg.jar
    Sep 18, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 248 files cached, 2 files newly uploaded in 0 seconds
    Sep 18, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 148d6269386e03b8acc83a543743e9acd29909f92967069cc060211c436c7640> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FI1iaThuA7isyDpUN0PprNKZCfkpZwacwGAhHENsdkA.pb
    Sep 18, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 18, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 18, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 18, 2021 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-17_23_45_13-8888735768678419727?project=apache-beam-testing
    Sep 18, 2021 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-17_23_45_13-8888735768678419727
    Sep 18, 2021 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-17_23_45_13-8888735768678419727
    Sep 18, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-18T06:45:19.954Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:45:25.576Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 18, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:45:26.327Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:45:26.358Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:45:26.385Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:45:26.434Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:45:26.460Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:45:26.485Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 18, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:45:26.803Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 18, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:45:26.876Z: Starting 5 workers in us-central1-a...
    Sep 18, 2021 6:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:45:53.746Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2021 6:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:46:13.043Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2021 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:46:38.824Z: Workers have started successfully.
    Sep 18, 2021 6:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:46:38.850Z: Workers have started successfully.
    Sep 18, 2021 6:47:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:47:06.775Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 18, 2021 6:47:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:47:06.897Z: Cleaning up.
    Sep 18, 2021 6:47:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:47:06.976Z: Stopping worker pool...
    Sep 18, 2021 6:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:49:26.810Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2021 6:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T06:49:26.856Z: Worker pool stopped.
    Sep 18, 2021 6:49:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-17_23_45_13-8888735768678419727 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9ff13ac3-1a6e-4c0c-a574-0adff4797089 and timestamp: 2021-09-18T06:49:32.795000000Z:
                     Metric:                    Value:
                   read_time                     8.761
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2021 6:49:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 39.42 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 11s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/ioqdzwgbbysx6

Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Sat Sep 11 06:44:26 UTC 2021.
Invalidating in-memory cache of /home/jenkins/.gradle/caches/journal-1/file-access.bin
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 0.341 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2438

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2438/display/redirect?page=changes>

Changes:

[noreply] Minor: Prune docker volumes in Inventory job(#15532)


------------------------------------------
[...truncated 340.14 KB...]
> Task :sdks:java:extensions:sql:perf-tests:testClasses
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) started.
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2e2b2489cf9d882242349157c9769a31
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'
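
Note: the -DbeamTestPipelineOptions JSON array in the command line above is how these integration tests receive their configuration; Beam's TestPipeline deserializes that system property into PipelineOptions. A minimal sketch of reading it back, assuming only the Beam SDK test utilities on the classpath (the class name and printout are illustrative):

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.testing.TestPipeline;

    public class OptionsSketch {
      public static void main(String[] args) {
        // testingPipelineOptions() parses the JSON array held in the
        // beamTestPipelineOptions system property (or falls back to defaults).
        PipelineOptions options = TestPipeline.testingPipelineOptions();
        System.out.println("Runner: " + options.getRunner());
      }
    }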

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 18, 2021 12:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 18, 2021 12:45:20 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 18, 2021 12:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 18, 2021 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 12:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 12:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
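
The failure above is Beam's standard missing-coder error for a PCollection of Rows: the output of ParDo(RowMonitor) has element type Row, no coder for Row can be inferred, and the schema was not re-attached. A minimal sketch of the fix the message itself suggests, PCollection.setRowSchema, using an illustrative four-field schema in place of the test's real one (the class name, sample values, and default-runner execution are assumptions, not the test's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema for the four projected columns.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema)
                    .addValues("alice", "story", "hello", 3L)
                    .build())
                .withRowSchema(schema));

        // A pass-through ParDo like the test's RowMonitor: its output is again
        // PCollection<Row>, for which no coder can be inferred automatically...
        PCollection<Row> monitored = rows.apply("RowMonitor",
            ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }));

        // ...so the schema must be re-attached, which is exactly what the
        // IllegalStateException above asks for.
        monitored.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }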

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
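
The same failure can also be resolved through the first root cause listed above, setting the coder by hand: RowCoder.of(schema) is effectively the coder that setRowSchema installs, so the two spellings are interchangeable. A small helper sketch (the class and method names are illustrative):

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowCoderFix {
      // Equivalent in effect to rows.setRowSchema(schema): installs the
      // schema-derived coder so downstream coder inference never has to guess.
      static PCollection<Row> attachRowCoder(PCollection<Row> rows, Schema schema) {
        return rows.setCoder(RowCoder.of(schema));
      }
    }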

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 18, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 18, 2021 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 18, 2021 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
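
For reference, the push-down that the BEAMPlan above describes corresponds roughly to the following standalone BigQueryIO read: only the used fields are requested, and the supported predicate travels to the BigQuery Storage Read API as a row restriction. This is a sketch under assumptions, not the test's actual wiring; the table id is a placeholder:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Project only the used fields and ship the filter to the storage API,
        // mirroring usedFields and BigQueryFilter in the plan above.
        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news")  // placeholder table id
                .withMethod(TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
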
    Sep 18, 2021 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 18, 2021 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 18, 2021 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 18, 2021 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5940345697793417465.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-T8rrH2TELNbuh4XgaDr5Y23V20HI5R9g9FbPaOVIv30.jar
    Sep 18, 2021 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 18, 2021 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 18, 2021 12:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 96581bf42c7128cd2a50f61fe3911b17cca1e72d224acb344417760ce6e5ebb6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-llgb9CxxKM0qUPYf45EbF8yh5y0iSss0RBd2DObl67Y.pb
    Sep 18, 2021 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 18, 2021 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 18, 2021 12:45:34 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 18, 2021 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 18, 2021 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-17_17_45_35-301988535343975277?project=apache-beam-testing
    Sep 18, 2021 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-17_17_45_35-301988535343975277
    Sep 18, 2021 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-17_17_45_35-301988535343975277
    Sep 18, 2021 12:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-18T00:45:38.481Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 18, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:45:48.886Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 18, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:45:49.838Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 18, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:45:49.912Z: Expanding GroupByKey operations into optimizable parts.
    Sep 18, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:45:49.945Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 18, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:45:50.014Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 18, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:45:50.038Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 18, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:45:50.066Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 18, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:45:50.438Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 18, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:45:50.832Z: Starting 5 workers in us-central1-c...
    Sep 18, 2021 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:46:12.224Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 18, 2021 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:46:35.610Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 18, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:47:01.755Z: Workers have started successfully.
    Sep 18, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:47:01.792Z: Workers have started successfully.
    Sep 18, 2021 12:47:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:47:31.372Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 18, 2021 12:47:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:47:31.504Z: Cleaning up.
    Sep 18, 2021 12:47:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:47:31.614Z: Stopping worker pool...
    Sep 18, 2021 12:49:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:49:53.807Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 18, 2021 12:49:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-18T00:49:53.850Z: Worker pool stopped.
    Sep 18, 2021 12:50:00 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-17_17_45_35-301988535343975277 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5054a4b3-af77-46fa-95ea-0d11e7efa0d0 and timestamp: 2021-09-18T00:50:00.579000000Z:
                     Metric:                    Value:
                   read_time                     8.551
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 18, 2021 12:50:01 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
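
The InfluxDB warning above affects only metrics publication, not the test outcome: with no measurement/database configured, the numbers are printed to the console but not stored. If publishing were wanted, the job would presumably add the InfluxDB settings to -DbeamTestPipelineOptions, along the lines of the entries below (the flag names are assumed from Beam's InfluxDB test utilities and the values are placeholders, not this job's real configuration):

    "--influxDatabase=beam_test_metrics","--influxMeasurement=sql_bqio_read_java_batch","--influxHost=http://localhost:8086"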

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 45.26 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 38s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/7ronkkvanjuec

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2437

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2437/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-12740] Add option to CreateOptions to avoid GetObjectMetadata for


------------------------------------------
[...truncated 344.71 KB...]
Gradle Test Executor 3 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2e2b2489cf9d882242349157c9769a31
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 3'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 3'
Successfully started process 'Gradle Test Executor 3'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 17, 2021 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 17, 2021 6:45:25 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 17, 2021 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 17, 2021 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 6:45:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@945711342]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1394176916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2021 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 17, 2021 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2021 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2021 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 17, 2021 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8192769708742958485.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test--eo5hjTkA_gOYJj5pPo8Vv-4OWVNuJNTdDF9HuJYPOA.jar
    Sep 17, 2021 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 17, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2021 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 8bf3e6103fdb273470e3467c8f94bb2295d81284598b0a5d04cac59d23f8142f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-i_PmED_bJzRw40Z8j5S7IpXYEoRZiwpdBMrFnSP4FC8.pb
    Sep 17, 2021 6:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2021 6:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 17, 2021 6:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 17, 2021 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 17, 2021 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-17_11_45_39-6540985896134660808?project=apache-beam-testing
    Sep 17, 2021 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-17_11_45_39-6540985896134660808
    Sep 17, 2021 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-17_11_45_39-6540985896134660808
    Sep 17, 2021 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-17T18:45:42.635Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2021 6:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:45:49.060Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 17, 2021 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:45:49.854Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2021 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:45:49.891Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2021 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:45:49.920Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2021 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:45:50.004Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2021 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:45:50.074Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2021 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:45:50.146Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 17, 2021 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:45:50.547Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 17, 2021 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:45:50.626Z: Starting 5 workers in us-central1-c...
    Sep 17, 2021 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:46:11.576Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2021 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:46:24.034Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 17, 2021 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:46:24.069Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 17, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:46:34.376Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2021 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:46:58.408Z: Workers have started successfully.
    Sep 17, 2021 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:46:58.490Z: Workers have started successfully.
    Sep 17, 2021 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:47:30.285Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 17, 2021 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:47:30.400Z: Cleaning up.
    Sep 17, 2021 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:47:30.470Z: Stopping worker pool...
    Sep 17, 2021 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:49:54.511Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2021 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T18:49:54.541Z: Worker pool stopped.
    Sep 17, 2021 6:50:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-17_11_45_39-6540985896134660808 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 70ab0286-aff4-4587-b427-647593c2fb91 and timestamp: 2021-09-17T18:50:02.353000000Z:
                     Metric:                    Value:
                   read_time                    11.039
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2021 6:50:03 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 42.967 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 43s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/hzjxpuraomcyk

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2436

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2436/display/redirect>

Changes:


------------------------------------------
[...truncated 337.56 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 91a68d30729b97c362c4d1b5908c850f
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 17, 2021 12:44:55 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 17, 2021 12:44:56 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 17, 2021 12:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 17, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@977492762]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@271342901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
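
Note: the BeamPushDownIOSourceRel and the "Pushing down the following filter" line above
show project and predicate push-down working: only the four used fields are read, and the
filter is evaluated by the BigQuery Storage Read API instead of in a BeamCalcRel. Below is
a hypothetical sketch of a setup that yields such a plan; it assumes the DDL-based table
declaration with method=DIRECT_READ and the helper entry points named in the stack traces
(BeamSqlEnv, BeamSqlRelUtils.toPCollection), with placeholder project/dataset names and a
reduced column list:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
    import org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownSketch {
      public static void main(String[] args) throws Exception {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        BeamSqlEnv sqlEnv = BeamSqlEnv.inMemory(new BigQueryTableProvider());
        // method=DIRECT_READ (BigQuery Storage API) is what enables push-down; with the
        // DEFAULT (export) method the plan keeps the BeamCalcRel, as in the logs above.
        sqlEnv.executeDdl(
            "CREATE EXTERNAL TABLE HACKER_NEWS "
                + "(`by` VARCHAR, type VARCHAR, title VARCHAR, score INTEGER) "
                + "TYPE bigquery "
                + "LOCATION 'my-project:my_dataset.hacker_news' "
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'");
        PCollection<Row> rows =
            BeamSqlRelUtils.toPCollection(
                pipeline,
                sqlEnv.parseQuery(
                    "SELECT `by` AS author, type, title, score FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        pipeline.run().waitUntilFinish();
      }
    }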
    Sep 17, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 17, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1363837139006503109.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2TjWzeVf5-wlCDKk8P0aezQ9m2xzP9nubgCXb4-Ui4o.jar
    Sep 17, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 6 seconds
    Sep 17, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 274c09b88f0e490b31e4e90aadacd3736b384135a75ba8c0c86dcc33e3f9a73d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-J0wJuI8OSQsx5OkKrazTc2s4QTWnW6jAyG3MM-P5pz0.pb
    Sep 17, 2021 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2021 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 17, 2021 12:45:15 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 17, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-17_05_45_16-15022222221004330035?project=apache-beam-testing
    Sep 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-17_05_45_16-15022222221004330035
    Sep 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-17_05_45_16-15022222221004330035
    Sep 17, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-17T12:45:19.753Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:45:31.791Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 17, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:45:32.606Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:45:32.643Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:45:32.675Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:45:32.757Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:45:32.792Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:45:32.827Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 17, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:45:33.146Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 17, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:45:33.236Z: Starting 5 workers in us-central1-c...
    Sep 17, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:45:52.877Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:46:06.222Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 17, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:46:06.247Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 17, 2021 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:46:16.517Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2021 12:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:46:41.355Z: Workers have started successfully.
    Sep 17, 2021 12:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:46:41.400Z: Workers have started successfully.
    Sep 17, 2021 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:47:11.540Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 17, 2021 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:47:11.666Z: Cleaning up.
    Sep 17, 2021 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:47:11.729Z: Stopping worker pool...
    Sep 17, 2021 12:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:49:30.038Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2021 12:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T12:49:30.078Z: Worker pool stopped.
    Sep 17, 2021 12:49:35 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-17_05_45_16-15022222221004330035 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b13ac26c-8f0d-4bde-ae06-cc152641018b and timestamp: 2021-09-17T12:49:35.883000000Z:
                     Metric:                    Value:
                   read_time                     8.328
                 fields_read                 4375276.0
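
Note: read_time and fields_read above are custom metrics published by the monitor steps in
the pipeline graph (ParDo(RowMonitor) and ParDo(TimeMonitor) in steps s1-s3). A minimal,
hypothetical sketch of such a counter-publishing DoFn; the class and metric names mirror
the log but this is not the IT's actual code:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    // Counts fields flowing through, analogous to the fields_read value reported above.
    class RowMonitorSketch extends DoFn<Row, Row> {
      private final Counter fieldsRead =
          Metrics.counter(RowMonitorSketch.class, "fields_read");

      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        fieldsRead.inc(row.getFieldCount());
        out.output(row);
      }
    }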

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2021 12:49:36 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 44.45 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 17s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/3qjiegdf266wg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2435

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2435/display/redirect>

Changes:


------------------------------------------
[...truncated 337.61 KB...]
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 91a68d30729b97c362c4d1b5908c850f
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
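
Note: the command line above shows how the CI invokes this suite: the pipeline options are
passed to the test JVM as the beamTestPipelineOptions system property. A hypothetical local
reproduction (requires GCP credentials and access to the apache-beam-testing resources;
substitute your own options for the elided JSON array):

    > ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        -DbeamTestPipelineOptions='[...]'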

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 17, 2021 6:44:55 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 17, 2021 6:44:55 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 17, 2021 6:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 17, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@323141774]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1176308843]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 17, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 17, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2550726164129370281.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-deWHtMR5wmzwPK5pnKKjd5F6Ir4u_pC0SMdGES9dNTo.jar
    Sep 17, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_26_0/0.1/5fae4e97a2d8739462bd1572e48d01228766b6ef/beam-vendor-calcite-1_26_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_26_0-0.1-pYZ7esxRWyhKmBqBdfrpnxvg8woyykTvGbaCvLtyRyA.jar
    Sep 17, 2021 6:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 248 files cached, 2 files newly uploaded in 1 seconds
    Sep 17, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2021 6:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash 5f0a7955fc72cd956d32fe88bca969cb83e2048471bafdb8015edee863e8a5c1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Xwp5VfxyzZVtMv6IvKlpy4PiBIRxuv24AV7e6GPopcE.pb
    Sep 17, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 17, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 17, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 17, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-16_23_45_12-4141909358847894549?project=apache-beam-testing
    Sep 17, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-16_23_45_12-4141909358847894549
    Sep 17, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-16_23_45_12-4141909358847894549
    Sep 17, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-17T06:45:15.494Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:45:22.468Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 17, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:45:23.147Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:45:23.188Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:45:23.218Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:45:23.282Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:45:23.307Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:45:23.336Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 17, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:45:23.671Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 17, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:45:23.755Z: Starting 5 workers in us-central1-a...
    Sep 17, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:45:45.289Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2021 6:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:46:04.814Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2021 6:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:46:30.897Z: Workers have started successfully.
    Sep 17, 2021 6:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:46:30.922Z: Workers have started successfully.
    Sep 17, 2021 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:46:58.993Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 17, 2021 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:46:59.140Z: Cleaning up.
    Sep 17, 2021 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:46:59.218Z: Stopping worker pool...
    Sep 17, 2021 6:49:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:49:21.759Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2021 6:49:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T06:49:21.805Z: Worker pool stopped.
    Sep 17, 2021 6:49:28 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-16_23_45_12-4141909358847894549 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 159bfd6e-8072-4819-89bf-30c1e33f1379 and timestamp: 2021-09-17T06:49:28.583000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.111

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2021 6:49:29 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.043 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 37.912 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/f75ktoyzyggec

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2434

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2434/display/redirect?page=changes>

Changes:

[zyichi] [BEAM-12603] Add retries to FnApiRunnerTest due to flakiness of grpc

[noreply] [BEAM-12535] add dataframes notebook (#15470)


------------------------------------------
[...truncated 336.31 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 91a68d30729b97c362c4d1b5908c850f
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 17, 2021 12:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 17, 2021 12:44:58 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 17, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 17, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@977492762]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@271342901]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 17, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 17, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 17, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 17, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 17, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 17, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 17, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6936447019475044792.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-X9fphTc1zSRa_g-Bj2ZZLxeVgEiQq_nRNvSccdZ1mEo.jar
    Sep 17, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 17, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 17, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 75107f9df1418335b630d007da2d3678d6db15dfc5e3a4c2d164dab9f9ff7041> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-dRB_nfFBgzW2MNAH2i02eNbbFd_F46TC0WTaufn_cEE.pb
    Sep 17, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 17, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 17, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 17, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 17, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-16_17_45_13-13748401065240882438?project=apache-beam-testing
    Sep 17, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-16_17_45_13-13748401065240882438
    Sep 17, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-16_17_45_13-13748401065240882438
    Sep 17, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-17T00:45:16.283Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 17, 2021 12:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:45:23.434Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 17, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:45:24.230Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 17, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:45:24.267Z: Expanding GroupByKey operations into optimizable parts.
    Sep 17, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:45:24.301Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 17, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:45:24.399Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 17, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:45:24.429Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 17, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:45:24.476Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 17, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:45:24.845Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 17, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:45:24.918Z: Starting 5 workers in us-central1-a...
    Sep 17, 2021 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:45:38.937Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 17, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:46:08.175Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 17, 2021 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:46:34.385Z: Workers have started successfully.
    Sep 17, 2021 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:46:34.423Z: Workers have started successfully.
    Sep 17, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:47:02.481Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 17, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:47:02.649Z: Cleaning up.
    Sep 17, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:47:02.738Z: Stopping worker pool...
    Sep 17, 2021 12:49:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:49:29.200Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 17, 2021 12:49:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-17T00:49:29.281Z: Worker pool stopped.
    Sep 17, 2021 12:49:55 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-16_17_45_13-13748401065240882438 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7b0d8838-553a-4bfb-9641-87991658ade7 and timestamp: 2021-09-17T00:49:55.837000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     6.994

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 17, 2021 12:49:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
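
A note on the warning above: the InfluxDB publisher was left unconfigured, so these metrics are only printed and written to the BigQuery metrics table (see --metricsBigQueryTable in the test options); nothing is sent to InfluxDB. In jobs that do publish, the settings travel in the same -DbeamTestPipelineOptions array shown elsewhere in this log. A sketch of the extra entries, where the option names and values are assumptions based on Beam's InfluxDB test utilities rather than this job's actual configuration:

    "--influxDatabase=beam_test_metrics",
    "--influxMeasurement=sql_bqio_read_java_batch",
    "--influxHost=http://localhost:8086"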

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 5 mins 2.907 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 37s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/cj5di3lxo7w7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2433

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2433/display/redirect?page=changes>

Changes:

[randomstep] [BEAM-12899] Upgrade Gradle to version 6.9.x

[noreply] [BEAM-12701] Added extra parameter in to_csv for DeferredFrame to name


------------------------------------------
[...truncated 361.32 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 91a68d30729b97c362c4d1b5908c850f
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 5'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 5'
Successfully started process 'Gradle Test Executor 5'

Gradle Test Executor 5 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 16, 2021 6:49:35 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 16, 2021 6:49:38 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 16, 2021 6:49:39 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 16, 2021 6:49:49 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 6:49:49 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:49:53 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 6:49:53 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 6:49:53 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:49:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 6:49:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
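
For readers who want to reproduce this query outside the test harness: the IT drives it through Beam's internal planner (BeamSqlRelUtils.toPCollection, per the stack trace below), but the public-API equivalent is SqlTransform. A minimal self-contained sketch, with an illustrative one-row input standing in for the HACKER_NEWS table:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        // Illustrative schema covering only the columns the query references.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Pipeline p = Pipeline.create();

        PCollection<Row> input =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", "story", "hello", 3L).build())
                    .withRowSchema(schema)); // schema-aware coder, so the rows are queryable

        // PCOLLECTION is SqlTransform's implicit table name for a single input.
        PCollection<Row> result =
            input.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score FROM PCOLLECTION "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }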


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1125485903]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
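
The IllegalStateException above names its own fix: the PCollection<Row> emitted by ParDo(RowMonitor) carries no schema, so no Row coder can be inferred when the pipeline is finalized. A minimal sketch of the setRowSchema repair the message suggests, using a hypothetical two-field schema in place of the table's real one:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {

      // Hypothetical schema; the IT's actual schema comes from the BigQuery table.
      static final Schema SCHEMA =
          Schema.builder().addStringField("type").addInt64Field("score").build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<Row> rows =
            p.apply(Create.of("story"))
                .apply(
                    "ProduceRows",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(ProcessContext c) {
                            c.output(Row.withSchema(SCHEMA).addValues(c.element(), 3L).build());
                          }
                        }))
                // The fix: attach the schema so a Row coder can be inferred
                // before the pipeline is finalized.
                .setRowSchema(SCHEMA);

        p.run().waitUntilFinish();
      }
    }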

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2021 6:49:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2021 6:49:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:49:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 6:49:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2021 6:49:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:49:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 6:49:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1764687859]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2021 6:49:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 6:49:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:49:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 6:49:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 6:49:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:49:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 6:49:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2021 6:49:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
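
The usedFields list and the pushed-down filter in the plan above map directly onto the projection and row-restriction options that BigQueryIO exposes for the Storage Read API. A minimal sketch of an equivalent hand-written read; the table reference is a placeholder, not the table this test actually reads:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("my-project:my_dataset.HACKER_NEWS") // placeholder reference
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Projection: only the four used fields are read.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Predicate evaluated server-side by the Storage Read API.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
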
    Sep 16, 2021 6:50:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2021 6:50:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2021 6:50:05 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 16, 2021 6:50:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6402661844907658558.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-WvfikpCOqWWHTNuZNBQ4s8fYjx9nGhHU8VQ3zON4Euk.jar
    Sep 16, 2021 6:50:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 3 seconds
    Sep 16, 2021 6:50:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2021 6:50:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash ae7b5328ca459a1762cece2fd3ff70d5ea7f4a372fbf3966ab0c0c544a50ac02> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-rntTKMpFmhdizs4v0_9w1ep_SjcvvzlmqwwMVEpQrAI.pb
    Sep 16, 2021 6:50:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2021 6:50:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 16, 2021 6:50:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 16, 2021 6:50:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 16, 2021 6:50:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-16_11_50_14-4695151237778241422?project=apache-beam-testing
    Sep 16, 2021 6:50:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-16_11_50_14-4695151237778241422
    Sep 16, 2021 6:50:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-16_11_50_14-4695151237778241422
    Sep 16, 2021 6:50:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-16T18:50:23.675Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2021 6:50:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:50:37.850Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 16, 2021 6:50:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:50:39.155Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2021 6:50:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:50:39.481Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2021 6:50:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:50:39.749Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2021 6:50:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:50:40.173Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2021 6:50:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:50:40.266Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2021 6:50:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:50:40.366Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 16, 2021 6:50:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:50:41.650Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 16, 2021 6:50:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:50:41.793Z: Starting 5 workers in us-central1-c...
    Sep 16, 2021 6:51:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:51:06.017Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2021 6:51:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:51:23.104Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2021 6:51:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:51:52.946Z: Workers have started successfully.
    Sep 16, 2021 6:51:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:51:52.984Z: Workers have started successfully.
    Sep 16, 2021 6:52:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:52:40.453Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 16, 2021 6:52:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:52:41.990Z: Cleaning up.
    Sep 16, 2021 6:52:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:52:43.347Z: Stopping worker pool...
    Sep 16, 2021 6:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:55:08.150Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2021 6:55:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T18:55:08.367Z: Worker pool stopped.
    Sep 16, 2021 6:55:21 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-16_11_50_14-4695151237778241422 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1416c547-bf94-4053-ad53-9a220c44c91b and timestamp: 2021-09-16T18:55:21.909000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    16.753

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2021 6:55:23 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 5 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 5 mins 59.596 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 50s
152 actionable tasks: 106 executed, 46 from cache

Publishing build scan...
https://gradle.com/s/hhbkuvdcjigp4

Stopped 4 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2432

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2432/display/redirect>

Changes:


------------------------------------------
[...truncated 341.16 KB...]
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 16, 2021 12:44:57 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 16, 2021 12:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 16, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@799287534]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1883029333]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 16, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 16, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5057746223146241907.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XOEh0C7VSvwz47bI8STE9p8ozRWqb3k7imOFA5KlQ6c.jar
    Sep 16, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.15.1/91e6dfab8f141f77c6a0dd147a94bd186993a22c/testcontainers-1.15.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.15.1-_fMrLpRXSowMnAISRc73bMk5UqrXNnsv6vkPXOZRXu0.jar
    Sep 16, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.15.1/ccd374ca7e5d8b06fb31ecffeb03abc42feada84/kafka-1.15.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.15.1-Z4qZ3jdWXDvdB7xD9VivkdVQiRZWNWUhPSIANVCSZDA.jar
    Sep 16, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-api/3.2.7/81408fc988c229ea11354fee9902c47842343f04/docker-java-api-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-api-3.2.7-AwtXDacySf3gUHoixVjjNqW0wUI0YcGJjMO-N7kxBho.jar
    Sep 16, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Sep 16, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Sep 16, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport-zerodep/3.2.7/d74cb05fabc57e9ec441dc39d0bc58ad649fad3d/docker-java-transport-zerodep-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-zerodep-3.2.7-O2a9rUb-899yDZMu4E8j9ItFOkNoggqbLP2xaLmOwJY.jar
    Sep 16, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Sep 16, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Sep 16, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Sep 16, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport/3.2.7/315903a129f530422747efc163dd255f0fa2555e/docker-java-transport-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-3.2.7-ynNXlSXhDSyiLCXkA4c9E9_HGeKB7-wbpEzmswFrI-A.jar
    Sep 16, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 239 files cached, 11 files newly uploaded in 1 seconds
    Sep 16, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 513c87cfb263bd332e090c542d31bdc8d487fa52b98bf052789417abfb1c7fa1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-UTyHz7JjvTMuCQxULTG9yNSH-lK5i_BSeJQXq_scf6E.pb
    Sep 16, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 16, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 16, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 16, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-16_05_45_11-14919211802293176260?project=apache-beam-testing
    Sep 16, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-16_05_45_11-14919211802293176260
    Sep 16, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-16_05_45_11-14919211802293176260
    Sep 16, 2021 12:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-16T12:45:14.700Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:45:29.214Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 16, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:45:30.067Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:45:30.152Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:45:30.208Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:45:30.322Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:45:30.370Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:45:30.412Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 16, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:45:31.081Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 16, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:45:31.225Z: Starting 5 workers in us-central1-c...
    Sep 16, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:46:00.219Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:46:05.286Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 16, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:46:05.353Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 16, 2021 12:46:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:46:15.704Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2021 12:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:46:41.005Z: Workers have started successfully.
    Sep 16, 2021 12:46:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:46:41.051Z: Workers have started successfully.
    Sep 16, 2021 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:47:16.525Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 16, 2021 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:47:16.787Z: Cleaning up.
    Sep 16, 2021 12:47:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:47:16.896Z: Stopping worker pool...
    Sep 16, 2021 12:49:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:49:41.156Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2021 12:49:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T12:49:41.249Z: Worker pool stopped.
    Sep 16, 2021 12:49:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-16_05_45_11-14919211802293176260 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 66159334-5e5e-4e13-bfce-f9f802171d10 and timestamp: 2021-09-16T12:49:48.939000000Z:
                     Metric:                    Value:
                   read_time                    13.629
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2021 12:49:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 56.972 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 30s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/opq4x3jtu6pf2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2431

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2431/display/redirect?page=changes>

Changes:

[ajamato] [BEAM-12898] Disable Flink Load tests which are leading Dataproc


------------------------------------------
[...truncated 338.56 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2277531f8cb1c01e0194391663a4e766
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 16, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 16, 2021 6:45:07 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 16, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 16, 2021 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 6:45:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 6:45:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:45:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
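
The failure above is self-describing: the RowMonitor ParDo emits Beam Rows, and a PCollection of Row has no default Coder until a schema is attached. A minimal sketch of the remedy the exception text itself suggests (setRowSchema), assuming the test's DoFn is reachable as RowMonitor and its output Schema is at hand as rowSchema; input, RowMonitor, and rowSchema are placeholders here, not the test's actual code:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Attach the schema so a RowCoder can be inferred for the Row output;
    // this is equivalent to .setCoder(RowCoder.of(rowSchema)).
    PCollection<Row> monitored =
        input
            .apply("ParDo(RowMonitor)", ParDo.of(new RowMonitor()))
            .setRowSchema(rowSchema);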

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2038704487]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
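
    This is the run where push-down succeeds: compare the BEAMPlan above with the two failing runs, where a BeamCalcRel still sits on top of BeamIOSourceRel. Here the projection (usedFields) and the filter have been folded into BeamPushDownIOSourceRel, so the BigQuery Storage Read API returns only the four referenced columns, already filtered. For reference, a rough sketch of the table DDL that selects this code path, assuming Beam SQL's bigquery table provider and its "method" table property; the LOCATION value and the abbreviated column list are illustrative, not taken from this test:

        CREATE EXTERNAL TABLE HACKER_NEWS (
            title VARCHAR,
            `by` VARCHAR,
            score INTEGER,
            type VARCHAR
            -- remaining columns omitted
        )
        TYPE bigquery
        LOCATION 'project-id:dataset.HACKER_NEWS'
        TBLPROPERTIES '{ "method": "DIRECT_READ" }'
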
    Sep 16, 2021 6:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2501984044205675756.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-DkuTZqyrOHQzTTH2U7X2_3KF5Ys2D8E_rncb0cli_rE.jar
    Sep 16, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 16, 2021 6:45:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 1f620931db966be012c886243848163cd2508005904362d253b847356f4483a8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-H2IJMduWa-ASyIYkOEgWPNJQgAWQQ2LSU7hHNW9Eg6g.pb
    Sep 16, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 16, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 16, 2021 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 16, 2021 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-15_23_45_22-3612431813213432821?project=apache-beam-testing
    Sep 16, 2021 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-15_23_45_22-3612431813213432821
    Sep 16, 2021 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-15_23_45_22-3612431813213432821
    Sep 16, 2021 6:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-16T06:45:25.657Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2021 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:45:34.316Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 16, 2021 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:45:35.075Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2021 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:45:35.148Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2021 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:45:35.208Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2021 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:45:35.340Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2021 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:45:35.382Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2021 6:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:45:35.425Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 16, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:45:36.015Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 16, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:45:36.146Z: Starting 5 workers in us-central1-c...
    Sep 16, 2021 6:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:45:50.433Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2021 6:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:46:07.302Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 16, 2021 6:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:46:07.355Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 16, 2021 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:46:17.620Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2021 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:46:42.606Z: Workers have started successfully.
    Sep 16, 2021 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:46:42.647Z: Workers have started successfully.
    Sep 16, 2021 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:47:14.950Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 16, 2021 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:47:15.132Z: Cleaning up.
    Sep 16, 2021 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:47:15.291Z: Stopping worker pool...
    Sep 16, 2021 6:49:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:49:27.384Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2021 6:49:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T06:49:27.439Z: Worker pool stopped.
    Sep 16, 2021 6:49:35 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-15_23_45_22-3612431813213432821 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 39606f97-94cd-4f1d-b25a-62181c45ba1c and timestamp: 2021-09-16T06:49:35.095000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.362

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2021 6:49:36 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
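
The InfluxDB warning is consistent with the options this job passes: the -DbeamTestPipelineOptions list (visible below where Gradle Test Executor 2 starts) configures the BigQuery metrics sink (--metricsBigQueryDataset, --metricsBigQueryTable) but no InfluxDB measurement or database, so InfluxDBPublisher skips that backend. If InfluxDB publishing were wanted, the missing properties would be supplied as extra pipeline options, roughly as below; these option names and values are assumptions based on the warning text and the job's BigQuery settings, not options verified from this build:

    --influxDatabase=beam_performance
    --influxMeasurement=sql_bqio_read_java_batch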

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 34.374 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 12s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/qz3iavuu2quwu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2430

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2430/display/redirect?page=changes>

Changes:

[dpcollins] [BEAM-12882] - fix test that is flaky when jenkins is overloaded

[noreply] [BEAM-12543] Fix DataFrrame typo (#15509)

[noreply] [BEAM-12794] Remove obsolete uses of sys.exc_info. (#15507)

[noreply] [BEAM-11666]  flake on RecordingManagerTest (#15118)

[kawaigin] [BEAM-10708] Introspect beam_sql output

[noreply] Minor: Restore "Bugfix" section in CHANGES.md (#15516)

[Kyle Weaver] [BEAM-10459] Unignore numeric aggregation tests.


------------------------------------------
[...truncated 353.41 KB...]
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 2277531f8cb1c01e0194391663a4e766
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

Gradle Test Executor 2 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 16, 2021 12:50:05 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 16, 2021 12:50:08 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 16, 2021 12:50:10 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 16, 2021 12:50:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 12:50:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:50:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 12:50:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 12:50:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:50:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 12:50:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1711003349]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 16, 2021 12:50:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2021 12:50:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:50:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 12:50:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 16, 2021 12:50:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:50:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 12:50:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1656216931]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2021 12:50:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 12:50:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:50:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 16, 2021 12:50:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 16, 2021 12:50:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 16, 2021 12:50:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 16, 2021 12:50:37 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 16, 2021 12:50:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 16, 2021 12:50:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 16, 2021 12:50:46 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 16, 2021 12:50:46 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 16, 2021 12:50:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5890461711699957094.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PRJcv0kQfXeu_NbA-hs-AwOOn27aWtugRvDgDA0K7og.jar
    Sep 16, 2021 12:50:52 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 5 seconds
    Sep 16, 2021 12:50:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 16, 2021 12:50:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 08a72176caefa466e2c4b3e58f0e878f9e355fe082139f4ab7a6c60171593fe6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-CKchdsrvpGbixLPljw6Hj541X-CCE59Kt6bGAXFZP-Y.pb
    Sep 16, 2021 12:50:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 16, 2021 12:50:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 16, 2021 12:50:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 16, 2021 12:51:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 16, 2021 12:51:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-15_17_51_00-16927309341686112669?project=apache-beam-testing
    Sep 16, 2021 12:51:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-15_17_51_00-16927309341686112669
    Sep 16, 2021 12:51:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-15_17_51_00-16927309341686112669
    Sep 16, 2021 12:51:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-16T00:51:04.087Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 16, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:11.725Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 16, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:12.495Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 16, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:12.543Z: Expanding GroupByKey operations into optimizable parts.
    Sep 16, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:12.584Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 16, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:12.697Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 16, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:12.736Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 16, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:12.770Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 16, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:13.230Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 16, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:13.324Z: Starting 5 workers in us-central1-c...
    Sep 16, 2021 12:51:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:23.628Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 16, 2021 12:51:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:43.204Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 16, 2021 12:51:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:43.243Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 16, 2021 12:51:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:51:53.545Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 16, 2021 12:52:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:52:32.723Z: Workers have started successfully.
    Sep 16, 2021 12:52:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:52:32.759Z: Workers have started successfully.
    Sep 16, 2021 12:53:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:53:06.315Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 16, 2021 12:53:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:53:06.520Z: Cleaning up.
    Sep 16, 2021 12:53:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:53:06.665Z: Stopping worker pool...
    Sep 16, 2021 12:55:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:55:21.681Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 16, 2021 12:55:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-16T00:55:21.727Z: Worker pool stopped.
    Sep 16, 2021 12:55:27 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-15_17_51_00-16927309341686112669 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b2d20870-6a8d-4553-8af9-d5f8131a9cf8 and timestamp: 2021-09-16T00:55:27.632000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.357

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 16, 2021 12:55:28 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 5 mins 36.102 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 10s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/uhngoaqikozim

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2429

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2429/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12885] Enable NeedsRunner Tests for Samza Portable Runner (#15512)

[noreply] [BEAM-12100][BEAM-10379][BEAM-9514][BEAM-12647][BEAM-12099]


------------------------------------------
[...truncated 382.00 KB...]
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 7c8314d0b1ab331a864b535bbeee5df5
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 29'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 29'
Successfully started process 'Gradle Test Executor 29'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 15, 2021 6:57:54 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 15, 2021 6:57:55 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 15, 2021 6:57:55 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 15, 2021 6:57:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 6:57:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:57:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 6:57:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 6:57:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:57:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 6:57:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@280858478]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:58:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 6:58:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2021 6:58:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 15, 2021 6:58:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2021 6:58:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2021 6:58:06 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 15, 2021 6:58:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests-jNBHRplzujunR2ca_W33DKopb0I1N66c2TzlUZA6-A8.jar
    Sep 15, 2021 6:58:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-d_4G4rn4Rll5Aje4hNipJhUiscPvjW925Yx4SqBJtnY.jar
    Sep 15, 2021 6:58:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3754504656764621449.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0XwWZp4LZjZJ7MBWiORdn18Au1rd_adIdZv51viV4cc.jar
    Sep 15, 2021 6:58:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 3 files newly uploaded in 0 seconds
    Sep 15, 2021 6:58:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2021 6:58:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 6c1ac708e8e335b1f27f858cb7b4abb8c202645bef25a08f3344df26896988d6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-bBrHCOjjNbHyf4WMt7SruMICZFvvJaCPM0TfJolpiNY.pb
    Sep 15, 2021 6:58:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2021 6:58:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 15, 2021 6:58:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 15, 2021 6:58:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 15, 2021 6:58:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-15_11_58_09-1410383782212129160?project=apache-beam-testing
    Sep 15, 2021 6:58:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-15_11_58_09-1410383782212129160
    Sep 15, 2021 6:58:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-15_11_58_09-1410383782212129160
    Sep 15, 2021 6:58:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-15T18:58:13.447Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:58:21.157Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 15, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:58:21.938Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:58:21.968Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:58:21.988Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:58:22.054Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:58:22.082Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:58:22.126Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 15, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:58:22.471Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 15, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:58:22.544Z: Starting 5 workers in us-central1-a...
    Sep 15, 2021 6:58:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:58:53.924Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
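
The cleanup this warning recommends can be scripted; a sketch, assuming the google-cloud-monitoring v3 Java client rather than anything this job itself runs (the project id and the custom-metric filter are illustrative):

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ProjectName;

    // List the project's metric descriptors and delete stale custom ones,
    // as the warning suggests.
    try (MetricServiceClient client = MetricServiceClient.create()) {
      for (MetricDescriptor d :
          client.listMetricDescriptors(ProjectName.of("apache-beam-testing")).iterateAll()) {
        if (d.getType().startsWith("custom.googleapis.com/")) {
          client.deleteMetricDescriptor(d.getName());
        }
      }
    }
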
    Sep 15, 2021 6:59:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:59:18.647Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 15, 2021 6:59:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:59:18.672Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 15, 2021 6:59:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:59:41.718Z: Workers have started successfully.
    Sep 15, 2021 6:59:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:59:41.749Z: Workers have started successfully.
    Sep 15, 2021 6:59:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T18:59:50.070Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2021 7:00:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T19:00:12.344Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 15, 2021 7:00:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T19:00:12.478Z: Cleaning up.
    Sep 15, 2021 7:00:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T19:00:12.548Z: Stopping worker pool...
    Sep 15, 2021 7:02:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T19:02:40.064Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2021 7:02:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T19:02:40.123Z: Worker pool stopped.
    Sep 15, 2021 7:02:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-15_11_58_09-1410383782212129160 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 626a92d7-cf90-4df2-a8b6-dc39b613242c and timestamp: 2021-09-15T19:02:47.866000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                      8.64

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2021 7:02:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
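
This warning is why read_time above never leaves the console: the InfluxDB publisher needs both a measurement and a database. A sketch of the missing configuration, using the option names Beam's load-test docs describe; treat both the names and the values as assumptions for this particular job:

    "--influxDatabase=beam_test_metrics",
    "--influxMeasurement=sql_bqio_read_java_batch",
    "--influxHost=http://localhost:8086"

added to the -DbeamTestPipelineOptions array the test executor is started with.
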

Gradle Test Executor 29 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.006 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.005 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 57.531 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 18m 28s
152 actionable tasks: 136 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/shar7cztvl3es

Stopped 11 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2428

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2428/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12153] revert "implement GroupByKey with CombinePerKey with

[noreply] [BEAM-12845] Add AWS services as a runtime dependency to Spark Job


------------------------------------------
[...truncated 358.36 KB...]
> Task :sdks:java:extensions:sql:perf-tests:testClasses
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) started.
Gradle Test Executor 6 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 87fb8cbd9bf06acd483f26e1ccaafa5a
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 6'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 6'
Successfully started process 'Gradle Test Executor 6'
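
The -DbeamTestPipelineOptions JSON array above is how the integration test receives its Dataflow configuration. Inside the test JVM, Beam's TestPipeline turns that system property into PipelineOptions; a minimal sketch of the mechanism:

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.testing.TestPipeline;

    // TestPipeline parses the JSON array in the beamTestPipelineOptions
    // system property into PipelineOptions for the test run; here, for
    // example, the GCP view of these options would report the project
    // "apache-beam-testing".
    PipelineOptions options = TestPipeline.testingPipelineOptions();
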

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 15, 2021 12:47:25 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 15, 2021 12:47:26 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 15, 2021 12:47:26 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 15, 2021 12:47:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 12:47:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:47:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 12:47:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 12:47:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:47:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 12:47:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
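
The root-cause list in the exception names the fix: the Row PCollection produced by the rel node has no schema, so no RowCoder can be inferred. A minimal sketch of the setRowSchema route, with the four-field output schema of the query above; the helper is illustrative, not the test's actual code:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Declaring the output schema (author, type, title, score) lets Beam
    // attach a RowCoder, which is exactly what the failed Coder inference
    // above is asking for.
    static PCollection<Row> attachOutputSchema(PCollection<Row> rows) {
      Schema schema =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();
      return rows.setRowSchema(schema);
    }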

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2021 12:47:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2021 12:47:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:47:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 12:47:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2021 12:47:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:47:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 12:47:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2021 12:47:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 12:47:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:47:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 12:47:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 12:47:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:47:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 12:47:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2021 12:47:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 15, 2021 12:47:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2021 12:47:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2021 12:47:36 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 15, 2021 12:47:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4874788525819239956.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-gTwDvKiHwPyHLjgQ_xJVguy4PPe8vdUuQ6_w78pCYXs.jar
    Sep 15, 2021 12:47:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 15, 2021 12:47:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2021 12:47:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 32fd05be82c3d3d46105e89e00b53042f1c5f6c5468481c07401c0414c1973d8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Mv0FvoLD09RhBeieALUwQvHF9sVGhIHAdAHAQUwZc9g.pb
    Sep 15, 2021 12:47:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2021 12:47:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 15, 2021 12:47:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 15, 2021 12:47:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 15, 2021 12:47:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-15_05_47_39-14570986282291584778?project=apache-beam-testing
    Sep 15, 2021 12:47:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-15_05_47_39-14570986282291584778
    Sep 15, 2021 12:47:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-15_05_47_39-14570986282291584778
    Sep 15, 2021 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-15T12:47:43.371Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2021 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:47:47.002Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 15, 2021 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:47:47.704Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2021 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:47:47.790Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2021 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:47:47.843Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2021 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:47:48.072Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2021 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:47:48.121Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2021 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:47:48.163Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 15, 2021 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:47:48.502Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 15, 2021 12:47:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:47:48.572Z: Starting 5 workers in us-central1-a...
    Sep 15, 2021 12:48:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:48:22.043Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2021 12:48:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:48:34.256Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2021 12:49:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:49:01.372Z: Workers have started successfully.
    Sep 15, 2021 12:49:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:49:01.403Z: Workers have started successfully.
    Sep 15, 2021 12:49:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:49:32.680Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 15, 2021 12:49:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:49:32.824Z: Cleaning up.
    Sep 15, 2021 12:49:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:49:32.911Z: Stopping worker pool...
    Sep 15, 2021 12:51:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:51:59.347Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2021 12:51:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T12:51:59.387Z: Worker pool stopped.
    Sep 15, 2021 12:52:06 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-15_05_47_39-14570986282291584778 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1d5598e4-5eac-48c0-a20b-bba8e69d4655 and timestamp: 2021-09-15T12:52:06.445000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.414

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2021 12:52:06 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 6 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 45.209 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 45s
152 actionable tasks: 112 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/yq4jfemvsxskm

Stopped 5 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2427

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2427/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12876] Adding doc and glossary entry for resource hints (#15499)


------------------------------------------
[...truncated 338.45 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 50340803db392ee5f0c7210f39a33b6d
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 15, 2021 6:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 15, 2021 6:45:01 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 15, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 15, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 6:45:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2021 6:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2021 6:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2038704487]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 6:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 6:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2021 6:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 15, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2021 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2021 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 15, 2021 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1508267424378395638.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-t-GsRI2oOmC8XMemse_AYlD077VmtZNdXz46aZbMS2U.jar
    Sep 15, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 15, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 3e4609834e00ee97ce09fe3ebb283464c8fd45a84ae3028311cb2f2f90e1d684> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-PkYJg04A7pfOCf4-uyg0ZMj9RahK4wKDEcsvL5Dh1oQ.pb
    Sep 15, 2021 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2021 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 15, 2021 6:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 15, 2021 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 15, 2021 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-14_23_45_16-6956103674034291982?project=apache-beam-testing
    Sep 15, 2021 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-14_23_45_16-6956103674034291982
    Sep 15, 2021 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-14_23_45_16-6956103674034291982
    Sep 15, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-15T06:45:19.649Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:26.521Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 15, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:27.445Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:27.477Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:27.520Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:27.595Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:27.632Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:27.666Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 15, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:27.992Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 15, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:28.090Z: Starting 5 workers in us-central1-c...
    Sep 15, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:36.135Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2021 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:58.340Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 15, 2021 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:45:58.364Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 15, 2021 6:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:46:08.569Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:46:32.932Z: Workers have started successfully.
    Sep 15, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:46:32.960Z: Workers have started successfully.
    Sep 15, 2021 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:47:02.125Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 15, 2021 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:47:02.250Z: Cleaning up.
    Sep 15, 2021 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:47:02.335Z: Stopping worker pool...
    Sep 15, 2021 6:49:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:49:26.307Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2021 6:49:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T06:49:26.342Z: Worker pool stopped.
    Sep 15, 2021 6:49:31 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-14_23_45_16-6956103674034291982 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0d25d108-b7c2-4cae-a366-b5d8933b1a71 and timestamp: 2021-09-15T06:49:31.473000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.224

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2021 6:49:31 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 35.526 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 11s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/k2g6epnqkxhig

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2426

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2426/display/redirect?page=changes>

Changes:

[dhuntsperger] updated Maven-to-Gradle conversion step in Java quickstart

[noreply] [BEAM-10913] - Updating Grafana from v6.7.3 to v8.1.2 (#15503)


------------------------------------------
[...truncated 337.32 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 50340803db392ee5f0c7210f39a33b6d
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 15, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 15, 2021 12:45:02 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 15, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 15, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 12:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
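
This failure is the coder-inference problem the exception itself spells out: the RowMonitor output is a PCollection of Beam Rows with no schema attached, so no RowCoder can be inferred. A minimal sketch of the remedy the message suggests (PCollection.setRowSchema), using an illustrative schema that mirrors the query's projected columns rather than the actual test code:

    // Hedged sketch, not from this build: attach a schema so Beam can
    // infer a RowCoder, per the error's guidance to use setRowSchema.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());

        // Illustrative schema matching the projected columns of the query.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of(
                    Row.withSchema(schema)
                        .addValues("someuser", "story", "A title", 3L)
                        .build()))
                // Without this call, coder inference fails exactly as above:
                // "Cannot provide a coder for a Beam Row."
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }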

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2038704487]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 15, 2021 12:45:09 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 15, 2021 12:45:10 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 15, 2021 12:45:10 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
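
Note the contrast with the two failing plans above: there BeamCalcRel evaluates the filter after the source emits all fourteen columns (expr#0..13), while this plan reads only the four used fields and hands the filter to BigQuery. As a rough sketch of how such a table is declared so DIRECT_READ push-down can apply, assuming SqlTransform.withDdlString is available and using placeholder project, dataset, and table names:

    // Illustrative only: Beam SQL DDL for a DIRECT_READ BigQuery table.
    // Assumes beam-sdks-java-extensions-sql plus the BigQuery table
    // provider are on the classpath (auto-loaded by SqlTransform).
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Placeholder location; the method property requests the Storage Read API.
        String ddl =
            "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, type VARCHAR, "
                + "title VARCHAR, score BIGINT) "
                + "TYPE bigquery "
                + "LOCATION 'my-project:my_dataset.hacker_news' "
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";

        PCollection<Row> filtered =
            PCollectionTuple.empty(p)
                .apply(
                    SqlTransform.query(
                            "SELECT `by` AS author, type, title, score FROM HACKER_NEWS "
                                + "WHERE (type = 'story' OR type = 'job') AND score > 2")
                        .withDdlString(ddl));

        p.run().waitUntilFinish();
      }
    }
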
    Sep 15, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 15, 2021 12:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 15, 2021 12:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 15, 2021 12:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7177806670991651038.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-X5XGp7BbdKofL5M3t1oJzMV0ZAKan7JAf7nTUN0SiPw.jar
    Sep 15, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 15, 2021 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 15, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 519ca277e1e0998cfb67febfcc1e33bc2b6b8a688f61f619a380e6d66c66326c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-UZyid-HgmYz7Z_6_zB4zvCtrimiPYfYZo4Dm1mxmMmw.pb
    Sep 15, 2021 12:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 15, 2021 12:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 15, 2021 12:45:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 15, 2021 12:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 15, 2021 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-14_17_45_17-675699696737844118?project=apache-beam-testing
    Sep 15, 2021 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-14_17_45_17-675699696737844118
    Sep 15, 2021 12:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-14_17_45_17-675699696737844118
    Sep 15, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-15T00:45:20.787Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 15, 2021 12:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:45:34.452Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 15, 2021 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:45:35.431Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 15, 2021 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:45:35.463Z: Expanding GroupByKey operations into optimizable parts.
    Sep 15, 2021 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:45:35.501Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 15, 2021 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:45:35.574Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 15, 2021 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:45:35.602Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 15, 2021 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:45:35.635Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 15, 2021 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:45:36.004Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 15, 2021 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:45:36.095Z: Starting 5 workers in us-central1-c...
    Sep 15, 2021 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:45:52.398Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 15, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:46:20.785Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 15, 2021 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:46:51.440Z: Workers have started successfully.
    Sep 15, 2021 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:46:51.479Z: Workers have started successfully.
    Sep 15, 2021 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:47:22.105Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 15, 2021 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:47:22.295Z: Cleaning up.
    Sep 15, 2021 12:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:47:22.409Z: Stopping worker pool...
    Sep 15, 2021 12:49:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:49:44.822Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 15, 2021 12:49:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-15T00:49:44.878Z: Worker pool stopped.
    Sep 15, 2021 12:49:51 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-14_17_45_17-675699696737844118 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f4c762cf-36d5-4008-80fe-b052bee83311 and timestamp: 2021-09-15T00:49:51.298000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.062

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 15, 2021 12:49:51 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.039 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 53.857 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 30s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/xntrnciymt4o2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2425

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2425/display/redirect>

Changes:


------------------------------------------
[...truncated 337.59 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':',5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 50340803db392ee5f0c7210f39a33b6d
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 14, 2021 6:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 14, 2021 6:44:59 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 14, 2021 6:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 14, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2038704487]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 14, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 14, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3266677501138577389.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-w30y57adrBjTMjjiNGAUi9tKtDpw1Ti_nJ60PUS7xuY.jar
    Sep 14, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 14, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 80f7baa768dfcd015af5ffa368b4e3f62cbb8ad533c90bbe89da4f5b07815aff> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-gPe6p2jfzQFa9f-jaLTj9iy7itUzyQu-idpPWweBWv8.pb
    Sep 14, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 14, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 14, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 14, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-14_11_45_14-8029357531428479480?project=apache-beam-testing
    Sep 14, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-14_11_45_14-8029357531428479480
    Sep 14, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-14_11_45_14-8029357531428479480
    Sep 14, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-14T18:45:17.810Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:45:23.630Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 14, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:45:24.484Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:45:24.517Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:45:24.548Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:45:24.617Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:45:24.638Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:45:24.661Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 14, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:45:25.028Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 14, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:45:25.097Z: Starting 5 workers in us-central1-a...
    Sep 14, 2021 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:45:34.013Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2021 6:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:46:08.638Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2021 6:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:46:34.989Z: Workers have started successfully.
    Sep 14, 2021 6:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:46:35.018Z: Workers have started successfully.
    Sep 14, 2021 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:47:04.622Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 14, 2021 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:47:04.780Z: Cleaning up.
    Sep 14, 2021 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:47:04.849Z: Stopping worker pool...
    Sep 14, 2021 6:49:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:49:21.124Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2021 6:49:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T18:49:21.169Z: Worker pool stopped.
    Sep 14, 2021 6:49:28 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-14_11_45_14-8029357531428479480 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5d4c5175-56c9-4858-a5d2-ddc9283bec6f and timestamp: 2021-09-14T18:49:28.965000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.196

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2021 6:49:29 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 34.923 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/avqtz2lckr3em

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2424

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2424/display/redirect>

Changes:


------------------------------------------
[...truncated 338.53 KB...]
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 50340803db392ee5f0c7210f39a33b6d
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 14, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 14, 2021 12:45:12 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 14, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 14, 2021 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 12:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 12:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@911883870]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1058462591]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
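
For context on the BeamSqlRelUtils.toPCollection frames in these traces: the test parses the SQL through a BeamSqlEnv and then converts the optimized rel node into a PCollection on the pipeline. A rough sketch of that call path, with the environment setup and query text as stated assumptions (a real run would register the BigQuery table provider and a valid table):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class ToPCollectionSketch {
      public static void main(String[] args) throws Exception {
        Pipeline pipeline = Pipeline.create();
        // inMemory(...) takes the table providers backing the schema; none are
        // registered here, so the query below is a placeholder only.
        BeamSqlEnv env = BeamSqlEnv.inMemory();
        BeamRelNode node = env.parseQuery("SELECT ... FROM HACKER_NEWS ...");
        // The call seen at the top of each stack trace above:
        PCollection<Row> output = BeamSqlRelUtils.toPCollection(pipeline, node);
      }
    }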

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2021 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 12:45:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2021 12:45:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
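
This plan shows both push-downs taking effect: usedFields trims the projection, and the supported{} filter carries the whole WHERE clause to BigQuery, which is what the buildIOReader line above reports. One hedged way to reproduce such a plan outside the IT is to declare the table with method DIRECT_READ and run the query through SqlTransform; the table location, column list, and reliance on the auto-loaded BigQuery table provider are assumptions here, not the IT's exact setup:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // DIRECT_READ is the property that enables Storage API reads and,
        // with them, the projection/filter push-down seen in the plan above.
        String ddl =
            "CREATE EXTERNAL TABLE HACKER_NEWS "
                + "(`by` VARCHAR, type VARCHAR, title VARCHAR, score BIGINT) "
                + "TYPE bigquery "
                + "LOCATION 'some-project:some_dataset.hacker_news' "  // placeholder
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";

        PCollection<Row> result =
            p.apply(
                SqlTransform.query(
                        "SELECT `by` AS author, type, title, score FROM HACKER_NEWS "
                            + "WHERE (type = 'story' OR type = 'job') AND score > 2")
                    .withDdlString(ddl));

        p.run().waitUntilFinish();  // submitting this would read real BigQuery data
      }
    }
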
    Sep 14, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 14, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8340718730747719098.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KCoGKIaDBca9RILfrN52yQ_6X7hVo6qXD09WT4ntTUs.jar
    Sep 14, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 14, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 10396efb33699286baaf518c503544d219372589011a4e97f276e196d0bd6fd4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EDlu-zNpkoa6r1GMUDVE0hk3JYkBGk6X8nbhltC9b9Q.pb
    Sep 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 14, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-14_05_45_27-4524187170090040190?project=apache-beam-testing
    Sep 14, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-14_05_45_27-4524187170090040190
    Sep 14, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-14_05_45_27-4524187170090040190
    Sep 14, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-14T12:45:30.718Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:45:38.192Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 14, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:45:39.130Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:45:39.169Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:45:39.190Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:45:39.271Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:45:39.299Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:45:39.332Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 14, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:45:39.663Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 14, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:45:39.733Z: Starting 5 workers in us-central1-c...
    Sep 14, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:45:57.279Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:46:14.997Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 14, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:46:15.030Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 14, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:46:25.226Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:46:49.598Z: Workers have started successfully.
    Sep 14, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:46:49.628Z: Workers have started successfully.
    Sep 14, 2021 12:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:47:21.648Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 14, 2021 12:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:47:21.802Z: Cleaning up.
    Sep 14, 2021 12:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:47:21.881Z: Stopping worker pool...
    Sep 14, 2021 12:49:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:49:43.657Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2021 12:49:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T12:49:43.734Z: Worker pool stopped.
    Sep 14, 2021 12:49:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-14_05_45_27-4524187170090040190 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 14eeaa76-e500-489a-8a3d-82ff2bd734a0 and timestamp: 2021-09-14T12:49:50.386000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.465

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2021 12:49:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
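
This warning is unrelated to the test failures; the collected metrics are simply dropped because the publisher was given no measurement/database. As a sketch of what it expects, recalled from Beam's test utilities and therefore an assumption (all values are placeholders):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")           // placeholder host
                .withDatabase("beam_performance")            // placeholder database
                .withMeasurement("sql_bqio_read_java_batch") // placeholder measurement
                .get();
      }
    }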

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.046 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 44.889 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 26s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4apcgrbbyiugi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2423

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2423/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11980] Java GCS - Implement IO Request Count metrics (#15394)


------------------------------------------
[...truncated 346.50 KB...]
Gradle Test Executor 3 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 50340803db392ee5f0c7210f39a33b6d
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 3'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 3'
Successfully started process 'Gradle Test Executor 3'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 14, 2021 6:47:41 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 14, 2021 6:47:42 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 14, 2021 6:47:42 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 14, 2021 6:47:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 6:47:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:47:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 6:47:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 6:47:47 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:47:47 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 6:47:48 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2021 6:47:48 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2021 6:47:48 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:47:48 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 6:47:48 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2021 6:47:48 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:47:48 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 6:47:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2038704487]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2021 6:47:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 6:47:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:47:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 6:47:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 6:47:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 6:47:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 6:47:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2021 6:47:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
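
At the IO level, the pushed-down projection and filter printed above amount to a Storage API read with selected fields and a row restriction. A hedged sketch of the equivalent hand-written read (the table name is a placeholder, and this is the general BigQueryIO pattern rather than what BigQueryTable.buildIOReader literally emits):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.apply(
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.hacker_news")  // placeholder table
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                // Mirrors usedFields in the plan above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Mirrors the pushed-down filter logged above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        // p.run() omitted: submitting would read real BigQuery data.
      }
    }
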
    Sep 14, 2021 6:47:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2021 6:47:54 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2021 6:47:54 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-WbmBv0W8lCroww4HLV2iheSTEdg9hX2CL8kP5tsmd-0.jar
    Sep 14, 2021 6:47:54 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7311546230794284275.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-njW_bkGpz7HjBKJPPTev1JaK1i0yExhGjkvvEaOyfis.jar
    Sep 14, 2021 6:47:55 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 14, 2021 6:47:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2021 6:47:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash cb29fcf98b9eae1d3729108395eb5b2d37ed03a92d92391516e2b59c5ad6b5e6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-yyn8-Yuerh03KRCDletbLTftA6ktkjkVFuK1nFrWteY.pb
    Sep 14, 2021 6:47:57 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2021 6:47:58 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 14, 2021 6:47:58 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 14, 2021 6:47:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 14, 2021 6:47:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_23_47_58-3252105525232313110?project=apache-beam-testing
    Sep 14, 2021 6:47:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-13_23_47_58-3252105525232313110
    Sep 14, 2021 6:47:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-13_23_47_58-3252105525232313110
    Sep 14, 2021 6:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-14T06:48:02.194Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:08.445Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 14, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:09.177Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:09.217Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:09.258Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:09.335Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:09.363Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:09.385Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 14, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:09.696Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 14, 2021 6:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:09.776Z: Starting 5 workers in us-central1-a...
    Sep 14, 2021 6:48:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:32.989Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2021 6:48:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:39.806Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 14, 2021 6:48:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:39.836Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 14, 2021 6:48:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:48:50.079Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2021 6:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:49:15.074Z: Workers have started successfully.
    Sep 14, 2021 6:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:49:15.094Z: Workers have started successfully.
    Sep 14, 2021 6:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:49:42.814Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 14, 2021 6:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:49:42.953Z: Cleaning up.
    Sep 14, 2021 6:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:49:43.024Z: Stopping worker pool...
    Sep 14, 2021 6:52:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:52:10.924Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2021 6:52:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T06:52:10.961Z: Worker pool stopped.
    Sep 14, 2021 6:52:16 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-13_23_47_58-3252105525232313110 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0ff2fefd-320e-44c7-a6fe-57ff750f8ed9 and timestamp: 2021-09-14T06:52:16.624000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.488

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2021 6:52:17 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.038 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.048 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 40.05 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 55s
152 actionable tasks: 103 executed, 49 from cache

Publishing build scan...
https://gradle.com/s/63c6nby6atcme

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2422

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2422/display/redirect?page=changes>

Changes:

[dhuntsperger] fixed broken Python tab on HCatalog IO page

[noreply] Avoid apiary submission of job graph when it is not needed. (#15458)

[noreply] [BEAM-7261] Add support for BasicSessionCredentials for AWS credentials.

[noreply] Bump dataflow java container version to beam-master-20210913 (#15506)


------------------------------------------
[...truncated 338.57 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is e293a3d2c89581eb30e12c16bdf416da
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 14, 2021 12:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 14, 2021 12:45:01 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 14, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 14, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 14, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 14, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 14, 2021 12:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 14, 2021 12:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 14, 2021 12:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 14, 2021 12:45:08 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 14, 2021 12:45:08 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 14, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 14, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 14, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 14, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6578673708245297976.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-m3aFo7d7pNwdiUQ9cAtMczGU5ppzNILc62FVrD8oFDc.jar
    Sep 14, 2021 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 14, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 14, 2021 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 6f9d067de7ce905ac94723abdfe5f25fc0b59895316e6ef75d7c807e6366899d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-b50GfefOkFrJRyOr3-XyX8C1mJUxbm73XXyAfmNmiZ0.pb
    Sep 14, 2021 12:45:14 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 14, 2021 12:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 14, 2021 12:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 14, 2021 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 14, 2021 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_17_45_15-13702946519292313061?project=apache-beam-testing
    Sep 14, 2021 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-13_17_45_15-13702946519292313061
    Sep 14, 2021 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-13_17_45_15-13702946519292313061
    Sep 14, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-14T00:45:18.736Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 14, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:45:24.870Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 14, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:45:25.708Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 14, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:45:25.756Z: Expanding GroupByKey operations into optimizable parts.
    Sep 14, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:45:25.912Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 14, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:45:25.986Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 14, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:45:26.008Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 14, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:45:26.039Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 14, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:45:26.390Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 14, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:45:26.477Z: Starting 5 workers in us-central1-a...
    Sep 14, 2021 12:45:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:45:49.951Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 14, 2021 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:46:00.128Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 14, 2021 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:46:00.170Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 14, 2021 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:46:10.463Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 14, 2021 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:46:35.334Z: Workers have started successfully.
    Sep 14, 2021 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:46:35.360Z: Workers have started successfully.
    Sep 14, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:47:01.175Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 14, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:47:01.319Z: Cleaning up.
    Sep 14, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:47:01.386Z: Stopping worker pool...
    Sep 14, 2021 12:49:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:49:32.131Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 14, 2021 12:49:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-14T00:49:32.172Z: Worker pool stopped.
    Sep 14, 2021 12:49:39 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-13_17_45_15-13702946519292313061 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b525c846-870e-4c63-b79c-b353eba80db6 and timestamp: 2021-09-14T00:49:39.386000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     6.871
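
The fields_read and read_time figures above come from Beam metrics recorded by the pipeline's monitoring ParDos and read back from the PipelineResult once the job reaches a terminal state. A minimal sketch of how such a counter is wired up with Beam's Metrics API -- the DoFn below is illustrative, not the test's actual RowMonitor source:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical monitor DoFn: passes rows through unchanged while
    // incrementing a counter, in the spirit of ParDo(RowMonitor) above.
    class RowMonitorFn extends DoFn<Row, Row> {
      private final Counter fieldsRead =
          Metrics.counter("BigQueryIOPushDownIT", "fields_read");

      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        fieldsRead.inc(row.getFieldCount());
        out.output(row);
      }
    }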

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 14, 2021 12:49:40 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 43.504 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 19s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/wyj5k7j7bxfdw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2421

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2421/display/redirect?page=changes>

Changes:

[rohde.samuel] [BEAM-12842] Add timestamp to test work item to deflake

[suztomo] [BEAM-12873] HL7v2IO: to leave schematizedData null, not empty


------------------------------------------
[...truncated 338.70 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 49dffaa09b8e1afd253a5b200b3716de
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 13, 2021 6:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 13, 2021 6:45:00 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 13, 2021 6:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 13, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
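
This is Beam's standard missing-schema failure for a PCollection<Row>: a RowCoder cannot be inferred, so a schema has to be attached explicitly, exactly as the message suggests. A self-contained sketch of that fix, using a made-up toy schema (the real HACKER_NEWS table is much wider):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Toy schema for illustration only.
        Schema schema =
            Schema.builder().addStringField("author").addInt64Field("score").build();

        // Attaching the schema gives the PCollection a RowCoder, which is
        // what the "Unable to return a default Coder" failure above asks for.
        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("alice", 3L).build())
                    .withRowSchema(schema));

        // For a PCollection<Row> produced by some other transform, the
        // equivalent after-the-fact fix is: rows.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }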

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2038704487]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 13, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 13, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1142033632284406031.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-5BRHaXDDEyXIDNzk-dK8PY-kfdCHVQ4xFzY32HCbDXc.jar
    Sep 13, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 13, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 7c0eadb156e7f83781b6dfb43b3fe90bfa691bb00fdcc3ec9545cdb90048c6e3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-fA6tsVbn-DeBtt-0Oz_pC_ppG7AP3MPslUXNuQBIxuM.pb
    Sep 13, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 13, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 13, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 13, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_11_45_14-4796777176590874877?project=apache-beam-testing
    Sep 13, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-13_11_45_14-4796777176590874877
    Sep 13, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-13_11_45_14-4796777176590874877
    Sep 13, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-13T18:45:18.397Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:45:26.089Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 13, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:45:27.114Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 13, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:45:27.231Z: Expanding GroupByKey operations into optimizable parts.
    Sep 13, 2021 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:45:27.296Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 13, 2021 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:45:27.435Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 13, 2021 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:45:27.471Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 13, 2021 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:45:27.506Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 13, 2021 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:45:27.883Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 13, 2021 6:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:45:27.944Z: Starting 5 workers in us-central1-c...
    Sep 13, 2021 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:45:52.887Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2021 6:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-13T18:46:02.360Z: Autoscaling: Startup of the worker pool in zone us-central1-c reached 2 workers, but the goal was 5 workers. The service will retry. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2500.0 in region us-central1.
    Sep 13, 2021 6:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:46:02.404Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 13, 2021 6:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:46:02.424Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 13, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:46:37.389Z: Workers have started successfully.
    Sep 13, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:46:37.423Z: Workers have started successfully.
    Sep 13, 2021 6:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:47:25.242Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 13, 2021 6:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:47:25.450Z: Cleaning up.
    Sep 13, 2021 6:47:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:47:25.530Z: Stopping worker pool...
    Sep 13, 2021 6:49:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:49:43.008Z: Autoscaling: Resized worker pool from 1 to 0.
    Sep 13, 2021 6:49:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T18:49:43.062Z: Worker pool stopped.
    Sep 13, 2021 6:49:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-13_11_45_14-4796777176590874877 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ee48ed5b-7218-4d2d-95c5-1e26a464b6e0 and timestamp: 2021-09-13T18:49:49.658000000Z:
                     Metric:                    Value:
                   read_time                    21.855
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2021 6:49:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
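
The BEAMPlan above collapses the projection and filter into a single BeamPushDownIOSourceRel, so the Storage Read API ships only four columns of already-filtered rows -- which is why this test passes while the non-push-down variants fail before submission. In plain BigQueryIO terms, the push-down is roughly equivalent to the following sketch (the table reference is illustrative, not the test's dataset):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownEquivalent {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Only the four projected columns are read, and the WHERE clause is
        // evaluated server-side -- the same effect the pushed-down filter
        // "(type = 'story' OR type = 'job') AND score > 2" achieves above.
        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")
                    .withMethod(Method.DIRECT_READ)
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }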

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 54.571 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 28s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/wpkdsqrte3hby

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2420

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2420/display/redirect>

Changes:


------------------------------------------
[...truncated 335.71 KB...]

> Task :sdks:java:extensions:sql:testJar
Caching disabled for task ':sdks:java:extensions:sql:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:extensions:sql:testJar' is not up-to-date because:
  No history is available.
:sdks:java:extensions:sql:testJar (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.128 secs.
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:compileTestJava'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is ab74a7458097b86331496e153b6b4789
Task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' with cache key ab74a7458097b86331496e153b6b4789
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.184 secs.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 9bf3ac3215333546daf9554f53be71e8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 13, 2021 12:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 13, 2021 12:44:57 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 13, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 13, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2038704487]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 13, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 13, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1852039557803765943.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-O9Bv4xdnkJahZJ_O2D-cQHhcGjOe0L9Dco9GvB5DnUQ.jar
    Sep 13, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 13, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash e4ffd97ad7fe463e3f45c4d01f9a0a23536808c4ca7148719cc2ff580d27a27b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-5P_Zetf-Rj4_RcTQH5oKI1NoCMTKcUhxnML_WA0nons.pb
    Sep 13, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 13, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 13, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 13, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-13_05_45_12-11719669428517226292?project=apache-beam-testing
    Sep 13, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-13_05_45_12-11719669428517226292
    Sep 13, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-13_05_45_12-11719669428517226292
    Sep 13, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-13T12:45:16.137Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T12:45:24.799Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 13, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2021-09-13T12:45:25.799Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/24385 instances, 2/0 CPUs, 25/183671 disk GB, 0/2397 SSD disk GB, 1/288 instance groups, 1/291 managed instance groups, 1/517 instance templates, 1/614 in-use IP addresses.

    Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
    Sep 13, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T12:45:25.836Z: Cleaning up.
    Sep 13, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T12:45:25.886Z: Worker pool stopped.
    Sep 13, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T12:45:27.060Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-13_05_45_12-11719669428517226292 failed with status FAILED.
    Sep 13, 2021 12:45:32 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
    Sep 13, 2021 12:45:33 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
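
These two SEVERE lines explain the sentinel values below: the job failed before committing any counters, so the test-utils reader falls back to -1. A hedged sketch of how the metric is fetched -- the MetricsReader constructor shape here is an assumption inferred from the class and method names in the log:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.testutils.metrics.MetricsReader;

    class ReadCounters {
      // Mirrors the getCounterMetric call in the log (constructor arguments
      // are an assumption). Returns -1 when the counter was never committed,
      // which matches the fields_read value reported below.
      static long fieldsRead(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        result.waitUntilFinish();
        MetricsReader reader =
            new MetricsReader(
                result,
                "org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT");
        return reader.getCounterMetric("fields_read");
      }
    }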

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): db4cc790-e976-4cbd-af70-e06485dffd9c and timestamp: 2021-09-13T12:45:32.695000000Z:
                     Metric:                    Value:
                   read_time                       0.0
                 fields_read                      -1.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2021 12:45:33 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
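
Unlike the quota warnings in the runs above, this build's push-down job never got a single worker: the Workflow failed message shows the us-central1 CPU quota fully exhausted (required 2, available 0). When quota, not the pipeline, is the bottleneck, one workaround is to request a smaller fixed pool; a sketch using standard Dataflow options (values illustrative):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class QuotaFriendlyOptions {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // With --autoscalingAlgorithm=NONE (as this test runs), numWorkers is
        // a fixed pool size; one e2-standard-2 worker needs only 2 CPUs of quota.
        options.setNumWorkers(1);
        options.setMaxNumWorkers(1);
        // ... build and run the pipeline with these options as usual.
      }
    }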

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 40.449 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 14s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/xhnwe4kstskhe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2419

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2419/display/redirect>

Changes:


------------------------------------------
[...truncated 336.12 KB...]
Skipping task ':sdks:java:extensions:sql:testClasses' as it has no actions.
:sdks:java:extensions:sql:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:testJar (Thread[Execution worker for ':' Thread 10,5,main]) started.

> Task :sdks:java:extensions:sql:testJar
Caching disabled for task ':sdks:java:extensions:sql:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:extensions:sql:testJar' is not up-to-date because:
  No history is available.
:sdks:java:extensions:sql:testJar (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.131 secs.
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Execution worker for ':' Thread 10,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:compileTestJava'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is ab74a7458097b86331496e153b6b4789
Task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' with cache key ab74a7458097b86331496e153b6b4789
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.173 secs.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 9bf3ac3215333546daf9554f53be71e8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 13, 2021 6:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 13, 2021 6:44:58 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 13, 2021 6:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 13, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
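
The exception above names its own fix: Beam cannot infer a Coder for Row, so a PCollection of Rows needs a schema attached explicitly with PCollection.setRowSchema (or an explicit Coder via setCoder). Below is a minimal sketch of the failing pattern and the fix, using a hypothetical schema and values rather than the IT's actual ones; once the schema is attached, a query of the same shape as the one logged above can be run over the collection with SqlTransform.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Hypothetical schema mirroring the four columns the query above selects.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job", "comment"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void processElement(@Element String type, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(schema).addValues("someone", type, "a title", 3L).build());
                          }
                        }))
                // Without this line, pipeline construction fails with exactly the
                // IllegalStateException above: "Cannot provide a coder for a Beam Row."
                .setRowSchema(schema);

        // With the schema in place, a query of the same shape as the one logged
        // above runs over the collection (PCOLLECTION is Beam SQL's implicit table name).
        rows.apply(
            SqlTransform.query(
                "SELECT author, type, title, score FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }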

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
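
The two entries above are the push-down actually taking effect: instead of the BeamCalcRel-over-BeamIOSourceRel plan that the first two tests produced, the planner emitted a single BeamPushDownIOSourceRel, so only the four used fields are requested and the filter is evaluated inside the BigQuery Storage Read API rather than in the Beam job. Outside of Beam SQL, roughly the same read can be expressed against BigQueryIO directly; a sketch under assumed names (the table reference is a placeholder, not the table the IT reads):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(
            "Read BQ rows with push-down",
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.hacker_news") // placeholder table
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Column projection: only these fields leave BigQuery storage.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: applied server-side by the Storage Read API.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

The run that follows stages the job and submits the same three fused steps (Read with push-down, RowMonitor, TimeMonitor) to Dataflow.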
    Sep 13, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2021 6:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2021 6:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 13, 2021 6:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7879367242566746536.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qKmj_qzCpRvs-LS2WaXHpClOoMP9oKEzApWFwp1ZpGE.jar
    Sep 13, 2021 6:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 13, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 6d24cf6da9eec3dc96f960dc8d22c9f662e42cdef3573233e24b59f415d6a97a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-bSTPbanuw9yW-WDcjSLJ9mLkLN7zVzIz4ktZ9BXWqXo.pb
    Sep 13, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 13, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 13, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 13, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_23_45_12-9724516326294111648?project=apache-beam-testing
    Sep 13, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-12_23_45_12-9724516326294111648
    Sep 13, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-12_23_45_12-9724516326294111648
    Sep 13, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-13T06:45:16.424Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T06:45:23.050Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 13, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2021-09-13T06:45:23.740Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/24386 instances, 2/0 CPUs, 25/183711 disk GB, 0/2397 SSD disk GB, 1/283 instance groups, 1/286 managed instance groups, 1/512 instance templates, 1/615 in-use IP addresses.

    Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
    Sep 13, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T06:45:23.801Z: Cleaning up.
    Sep 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T06:45:23.887Z: Worker pool stopped.
    Sep 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T06:45:25.091Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-12_23_45_12-9724516326294111648 failed with status FAILED.
    Sep 13, 2021 6:45:28 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
    Sep 13, 2021 6:45:28 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 78fec180-e8b8-48b3-ba19-aaa8ed48fed6 and timestamp: 2021-09-13T06:45:28.353000000Z:
                     Metric:                    Value:
                   read_time                       0.0
                 fields_read                      -1.0
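
A fields_read of -1.0 lines up with the two SEVERE "Failed to get metric" entries above: the workflow was rejected on quota (the summary shows 2 CPUs required against 0 available in us-central1) before any worker processed data, so the counter was never reported and the reader fell back to a sentinel value. For reference, a counter lookup of this shape can be sketched with Beam's metrics API (the helper and its -1 sentinel are illustrative, not MetricsReader's actual implementation):

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class CounterLookupSketch {
      /** Returns the attempted value of a counter, or -1 if the job never reported it. */
      static long counterValue(PipelineResult result, String namespace, String name) {
        MetricQueryResults metrics =
            result.metrics()
                .queryMetrics(
                    MetricsFilter.builder()
                        .addNameFilter(MetricNameFilter.named(namespace, name))
                        .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          // Attempted values are available even when a job never committed results.
          return counter.getAttempted();
        }
        return -1L; // no such counter: mirrors the sentinel printed above
      }
    }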

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2021 6:45:28 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.059 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 35.42 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/khs3zi7l7skp4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2418

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2418/display/redirect>

Changes:


------------------------------------------
[...truncated 338.95 KB...]
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 13, 2021 12:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 13, 2021 12:44:58 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 13, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 13, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 13, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 13, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 13, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 13, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 13, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 13, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 13, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 13, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 13, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 13, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 13, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test919607493275932945.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-OTE8YSDUWBG4Sgv33rUQ44s37P6NZomVlUNtIT3-SRM.jar
    Sep 13, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 13, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 13, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 80baffd28ad82b6c94e9af3624894809a486f5e309c4d5039f92f41949d12497> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-gLr_0orYK2yU6a82JIlICaSG9eMJxNUDn5L0GUnRJJc.pb
    Sep 13, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 13, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 13, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 13, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 13, 2021 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_17_45_13-14486603282926067895?project=apache-beam-testing
    Sep 13, 2021 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-12_17_45_13-14486603282926067895
    Sep 13, 2021 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-12_17_45_13-14486603282926067895
    Sep 13, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-13T00:45:17.004Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 13, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:45:22.947Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 13, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:45:23.616Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 13, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:45:23.655Z: Expanding GroupByKey operations into optimizable parts.
    Sep 13, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:45:23.682Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 13, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:45:23.755Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 13, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:45:23.789Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 13, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:45:23.821Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 13, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:45:24.163Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 13, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:45:24.239Z: Starting 5 workers in us-central1-c...
    Sep 13, 2021 12:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:45:30.613Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 13, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-13T00:46:06.832Z: Autoscaling: Startup of the worker pool in zone us-central1-c reached 1 workers, but the goal was 5 workers. The service will retry. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2500.0 in region us-central1.
    Sep 13, 2021 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:46:19.227Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 13, 2021 12:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:46:19.289Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 13, 2021 12:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:46:39.798Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 13, 2021 12:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:46:39.833Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 13, 2021 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:46:40.974Z: Workers have started successfully.
    Sep 13, 2021 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:46:41.004Z: Workers have started successfully.
    Sep 13, 2021 12:47:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-13T00:47:04.578Z: Autoscaling: Unable to reach resize target in zone us-central1-c. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2500.0 in region us-central1.
    Sep 13, 2021 12:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:47:28.068Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 13, 2021 12:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:47:28.193Z: Cleaning up.
    Sep 13, 2021 12:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:47:28.276Z: Stopping worker pool...
    Sep 13, 2021 12:49:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:49:52.830Z: Autoscaling: Resized worker pool from 2 to 0.
    Sep 13, 2021 12:49:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-13T00:49:52.880Z: Worker pool stopped.
    Sep 13, 2021 12:49:58 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-12_17_45_13-14486603282926067895 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9a498a4f-8c9e-40cf-b676-44d9d4ea99fb and timestamp: 2021-09-13T00:49:58.187000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    23.894

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 13, 2021 12:49:58 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 5 mins 4.566 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 38s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/pmpyp6li6rni4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2417

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2417/display/redirect>

Changes:


------------------------------------------
[...truncated 333.78 KB...]
Skipping task ':sdks:java:extensions:sql:testClasses' as it has no actions.
:sdks:java:extensions:sql:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:testJar (Thread[Daemon worker,5,main]) started.

> Task :sdks:java:extensions:sql:testJar
Caching disabled for task ':sdks:java:extensions:sql:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:extensions:sql:testJar' is not up-to-date because:
  No history is available.
:sdks:java:extensions:sql:testJar (Thread[Daemon worker,5,main]) completed. Took 0.218 secs.
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Daemon worker,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:compileTestJava'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is ab74a7458097b86331496e153b6b4789
Task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' with cache key ab74a7458097b86331496e153b6b4789
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Daemon worker,5,main]) completed. Took 0.157 secs.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 9bf3ac3215333546daf9554f53be71e8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 12, 2021 6:44:55 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 12, 2021 6:44:56 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 12, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 12, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 12, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 12, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4658627405174761352.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XtODq3XsBYoVJ0x-Y_7UcmvyUjcNsw6Lqn66_Rv14z4.jar
    Sep 12, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 file newly uploaded in 0 seconds
    Sep 12, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash b50d3976b1f658766884321503045358fa3f293d6b7bfeacc6a1596a98a683cf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tQ05drH2WHZohDIVAwRTWPo_KT1re_6sxqFZapimg88.pb
    Sep 12, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 12, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 12, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 12, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_11_45_09-1474197715117398901?project=apache-beam-testing
    Sep 12, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-12_11_45_09-1474197715117398901
    Sep 12, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-12_11_45_09-1474197715117398901
    Sep 12, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-12T18:45:13.629Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T18:45:21.784Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 12, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2021-09-12T18:45:22.383Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/24385 instances, 2/0 CPUs, 25/183691 disk GB, 0/2397 SSD disk GB, 1/281 instance groups, 1/284 managed instance groups, 1/510 instance templates, 1/614 in-use IP addresses.

    Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
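
When this quota failure appears, the region's live usage and limits can be checked directly; this is a standard gcloud command, shown here purely as a diagnostic aid:

    > gcloud compute regions describe us-central1 --project=apache-beam-testing

The telling entry in the summary above is "2/0 CPUs": the single worker needed 2 vCPUs and the project had none available in us-central1 at submission time, so the workflow failed before any worker started.
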
    Sep 12, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T18:45:22.416Z: Cleaning up.
    Sep 12, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T18:45:22.468Z: Worker pool stopped.
    Sep 12, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T18:45:23.839Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2021 6:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-12_11_45_09-1474197715117398901 failed with status FAILED.
    Sep 12, 2021 6:45:27 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
    Sep 12, 2021 6:45:29 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d1c064a5-b49d-4404-aedd-c622c2ca9c36 and timestamp: 2021-09-12T18:45:27.676000000Z:
                     Metric:                    Value:
                 fields_read                      -1.0
                   read_time                       0.0
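
The -1.0 is a "metric not found" sentinel, consistent with the getCounterMetric failures logged just before it: the job failed before the RowMonitor step executed, so no fields_read counter was ever committed. A minimal sketch of this kind of lookup against the public Beam metrics API (the helper name and the -1 convention mirror the output above and are assumptions, not the testutils source):

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    class CounterLookup {
      // Returns the attempted value of a named counter, or -1 when the job
      // never reported it (e.g. it failed before the step ran).
      static long counterOrMinusOne(PipelineResult result, String namespace, String name) {
        for (MetricResult<Long> counter :
            result.metrics()
                .queryMetrics(
                    MetricsFilter.builder()
                        .addNameFilter(MetricNameFilter.named(namespace, name))
                        .build())
                .getCounters()) {
          return counter.getAttempted();
        }
        return -1L;
      }
    }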

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2021 6:45:29 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
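
(The InfluxDB publisher skips publishing here because no measurement/database was configured for this job. In other Beam performance jobs those values arrive as pipeline options along the lines of --influxDatabase=... and --influxMeasurement=...; the exact flag names for this suite are an assumption.)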

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 37.754 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/ngj5tk2een2ra

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2416

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2416/display/redirect>

Changes:


------------------------------------------
[...truncated 339.28 KB...]
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
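
The multiple-bindings warning is expected in this setup: the staged legacy worker jar bundles its own SLF4J binding alongside slf4j-jdk14 from the test runtime, and SLF4J simply picks one. If it ever needed tracking down, Gradle can list which dependency pulls in each binding (the configuration name is an assumption for this module):

    > ./gradlew :sdks:java:extensions:sql:perf-tests:dependencies --configuration testRuntimeClasspath | grep slf4j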

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 12, 2021 12:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
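
(The deprecated flag is the empty --workerHarnessContainerImage= visible in the executor command line above; per this warning the drop-in replacement would be an equivalent --sdkContainerImage= entry. Both are real DataflowPipelineOptions; updating the Jenkins job definition is the implied fix, not something this build performs.)
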
    Sep 12, 2021 12:44:56 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 12, 2021 12:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 12, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 12, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 12, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 12, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7486825016035298335.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SAgVxt_P5-e8wDcOUrg7FNrbWsIqJnZRAvDbV8EKUXk.jar
    Sep 12, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.thrift/libthrift/0.14.1/85348a0c44c298bbec5ae747e67ae12e60b3aef6/libthrift-0.14.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/libthrift-0.14.1-WzUQ_nLm8HJeKc7269seq6zMxp15_E7VC2gWAKh2Z-w.jar
    Sep 12, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.tomcat.embed/tomcat-embed-core/8.5.46/5d686394334d143f48251827435ab086a161e75e/tomcat-embed-core-8.5.46.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tomcat-embed-core-8.5.46-vl-FREjS7l1uADb-srT3ExYweaG2uaepdQjlWRetNcI.jar
    Sep 12, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.tomcat/tomcat-annotations-api/8.5.46/56c67699de192c603afd6f029e80e5ff8d98e7e9/tomcat-annotations-api-8.5.46.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tomcat-annotations-api-8.5.46-amtG0OaVhkRRTAyjZYs7B-YSOmgqIO4203lSQnNfq8M.jar
    Sep 12, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 4 files newly uploaded in 1 second
    Sep 12, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 9a503c998f9d4b1acab9c171e4aaaff17ea57c901ea7894659dfc58e5c37ed19> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-mlA8mY-dSxrKucFx5Kqv8X6lfJAep4lGWd_Fjlw37Rk.pb
    Sep 12, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 12, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 12, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 12, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-12_05_45_10-3693096024963707878?project=apache-beam-testing
    Sep 12, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-12_05_45_10-3693096024963707878
    Sep 12, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-12_05_45_10-3693096024963707878
    Sep 12, 2021 12:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-12T12:45:14.457Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:45:20.860Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 12, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:45:21.657Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 12, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:45:21.708Z: Expanding GroupByKey operations into optimizable parts.
    Sep 12, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:45:21.737Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 12, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:45:21.799Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 12, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:45:21.849Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 12, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:45:21.893Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 12, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:45:22.216Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 12, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:45:22.288Z: Starting 5 workers in us-central1-a...
    Sep 12, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:45:28.013Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-12T12:45:54.991Z: Autoscaling: Startup of the worker pool in zone us-central1-a reached 3 workers, but the goal was 5 workers. The service will retry. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2500.0 in region us-central1.
    Sep 12, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:46:09.136Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 12, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:46:09.162Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 12, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:46:28.237Z: Workers have started successfully.
    Sep 12, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:46:28.268Z: Workers have started successfully.
    Sep 12, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:46:57.197Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 12, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:46:57.334Z: Cleaning up.
    Sep 12, 2021 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:46:57.421Z: Stopping worker pool...
    Sep 12, 2021 12:49:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:49:31.871Z: Autoscaling: Resized worker pool from 3 to 0.
    Sep 12, 2021 12:49:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T12:49:31.913Z: Worker pool stopped.
    Sep 12, 2021 12:49:39 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-12_05_45_10-3693096024963707878 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a8ef74cc-7836-4705-8fd5-d5941c1d45e7 and timestamp: 2021-09-12T12:49:39.153000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.982

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2021 12:49:39 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 47.452 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 19s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/3gtlzupqhijsg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2415

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2415/display/redirect>

Changes:


------------------------------------------
[...truncated 335.06 KB...]
Skipping task ':sdks:java:extensions:sql:testClasses' as it has no actions.
:sdks:java:extensions:sql:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:testJar (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :sdks:java:extensions:sql:testJar
Caching disabled for task ':sdks:java:extensions:sql:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:extensions:sql:testJar' is not up-to-date because:
  No history is available.
:sdks:java:extensions:sql:testJar (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.127 secs.
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:compileTestJava'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is ab74a7458097b86331496e153b6b4789
Task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' with cache key ab74a7458097b86331496e153b6b4789
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.15 secs.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 9bf3ac3215333546daf9554f53be71e8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 12, 2021 6:44:54 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 12, 2021 6:44:55 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 12, 2021 6:44:55 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 12, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 12, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 12, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 12, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1793329628853421440.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-fX0S2Ewuz9TQZPBXli5pmdUsdeKyMkXMTarrRTsp8a0.jar
    Sep 12, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 file newly uploaded in 0 seconds
    Sep 12, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash b84fa1fe57b1ea614ab9cc6a33a8143241ccbc557c84b2807777901be7a4e118> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uE-h_lex6mFKucxqM6gUMkHMvFV8hLKAd3eQG-ek4Rg.pb
    Sep 12, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 12, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 12, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 12, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-11_23_45_08-8735805944682801155?project=apache-beam-testing
    Sep 12, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-11_23_45_08-8735805944682801155
    Sep 12, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-11_23_45_08-8735805944682801155
    Sep 12, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-12T06:45:12.499Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T06:45:19.075Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 12, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2021-09-12T06:45:19.752Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/24365 instances, 2/0 CPUs, 25/184816 disk GB, 0/2397 SSD disk GB, 1/274 instance groups, 1/277 managed instance groups, 1/503 instance templates, 1/594 in-use IP addresses.

    Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
    Sep 12, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T06:45:19.787Z: Cleaning up.
    Sep 12, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T06:45:19.835Z: Worker pool stopped.
    Sep 12, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T06:45:20.999Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2021 6:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-11_23_45_08-8735805944682801155 failed with status FAILED.
    Sep 12, 2021 6:45:25 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
    Sep 12, 2021 6:45:25 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): bdcfb102-e766-4104-82b0-a27b774a0268 and timestamp: 2021-09-12T06:45:25.230000000Z:
                     Metric:                    Value:
                   read_time                       0.0
                 fields_read                      -1.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2021 6:45:25 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 35.07 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 8s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/s7puq6ro3in2a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2414

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2414/display/redirect>

Changes:


------------------------------------------
[...truncated 337.78 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 9bf3ac3215333546daf9554f53be71e8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
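
(Context for the SLF4J warning above: two StaticLoggerBinder copies are on the
test classpath, one shaded into the legacy Dataflow worker jar and one from
slf4j-jdk14. SLF4J 1.x picks one of them, classpath-order dependent, and here
it picks the JDK14 binding, which is why the test output below is formatted by
java.util.logging.)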

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 12, 2021 12:44:55 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 12, 2021 12:44:56 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 12, 2021 12:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 12, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
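
An aside on reading the pair of plans above: SQLPlan> is the logical query as
Calcite parsed it, and BEAMPlan> is the physical plan Beam will execute. In
this non-push-down variant the projection and the story/job/score filter live
in a BeamCalcRel sitting on top of a plain BeamIOSourceRel scan, meaning Beam
reads every column and row and does the filtering itself; in the push-down
variant further down, the planner instead collapses both into a single
BeamPushDownIOSourceRel so that BigQuery does that work.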


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
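
This coder failure is what fails both non-push-down tests in every build in
this digest, and the exception names its own repair: a PCollection of Beam
Rows needs a schema attached before a RowCoder can be inferred. What follows
is a minimal, self-contained sketch of that API against an invented two-step
pipeline and a stand-in four-field schema; it is not the integration test's
code, whose PCollection is produced inside BeamSqlRelUtils.toPCollection as
the stack trace shows.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Stand-in schema with only the queried fields; the real HACKER_NEWS
        // table has 14 (expr#0..13 in the plans above).
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> input =
            p.apply(Create.of(3L, 9L))
                .apply(
                    MapElements.into(TypeDescriptor.of(Row.class))
                        .via((Long n) ->
                            Row.withSchema(schema)
                                .addValues("alice", "story", "hello", n)
                                .build()))
                // Without this call, pipeline construction fails exactly as
                // logged: "Cannot provide a coder for a Beam Row. Please
                // provide a schema instead using PCollection.setRowSchema."
                .setRowSchema(schema);

        // The same query the test runs, expressed against the schema'd input.
        PCollection<Row> filtered =
            input.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, `type`, `title`, `score` "
                        + "FROM PCOLLECTION "
                        + "WHERE (`type` = 'story' OR `type` = 'job') "
                        + "AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }

The sketch only demonstrates the API the exception points at; in the failing
tests the missing schema would have to be attached where toPCollection
materializes the BeamIOSourceRel output, not in test code.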

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 12, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2038704487]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 12, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
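
Unlike the two failures above, this push-down variant builds and runs: the
BEAMPlan has collapsed the scan and the Calc into one BeamPushDownIOSourceRel,
so only the four usedFields are requested and the filter is evaluated by
BigQuery itself. At the plain BigQueryIO level the same effect comes from the
Storage Read API options; a hedged sketch, with an illustrative table
reference rather than the test's table-provider wiring:

    import java.util.Arrays;

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows =
            p.apply(
                BigQueryIO.readTableRows()
                    .from("some-project:some_dataset.HACKER_NEWS") // illustrative
                    .withMethod(Method.DIRECT_READ)
                    // Column projection: the usedFields from the plan above.
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Predicate push-down: the filter the log reports pushing down.
                    .withRowRestriction(
                        "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }

DIRECT_READ is what makes withSelectedFields and withRowRestriction usable;
the export-based default read method exercised by readUsingDefaultMethod can
push down neither.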
    Sep 12, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 12, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 12, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 12, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test826078511669609757.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TkaelBOxE8Li7v4Bzbt5z3ny7wOv9q-y1TxGbvxZyTw.jar
    Sep 12, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 12, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 12, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103996 bytes, hash 29035ee44a670599ab55db3876558b11d682351762a83b4bce6728ddec277cde> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KQNe5EpnBZmrVds4dlWLEdaCNRdiqDtLzmco3ewnfN4.pb
    Sep 12, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 12, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 12, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 12, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 12, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-11_17_45_09-11099528075843360802?project=apache-beam-testing
    Sep 12, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-11_17_45_09-11099528075843360802
    Sep 12, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-11_17_45_09-11099528075843360802
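
(The job ID in that cancel command is the one reported by 'Submitted job'
just above, and --region has to name the region the job runs in, us-central1
here.)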
    Sep 12, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-12T00:45:12.848Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 12, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:19.634Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 12, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:20.362Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 12, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:20.400Z: Expanding GroupByKey operations into optimizable parts.
    Sep 12, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:20.426Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 12, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:20.481Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 12, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:20.514Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 12, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:20.543Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 12, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:20.849Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 12, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:20.942Z: Starting 5 workers in us-central1-c...
    Sep 12, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:41.563Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 12, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:50.887Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 12, 2021 12:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:45:50.920Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 12, 2021 12:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:46:01.171Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 12, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:46:26.260Z: Workers have started successfully.
    Sep 12, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:46:26.288Z: Workers have started successfully.
    Sep 12, 2021 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:46:54.886Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 12, 2021 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:46:55.018Z: Cleaning up.
    Sep 12, 2021 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:46:55.122Z: Stopping worker pool...
    Sep 12, 2021 12:49:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:49:20.764Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 12, 2021 12:49:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-12T00:49:20.822Z: Worker pool stopped.
    Sep 12, 2021 12:49:26 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-11_17_45_09-11099528075843360802 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 06ecd640-683c-4082-be1f-d93fa9fa0df7 and timestamp: 2021-09-12T00:49:26.199000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.133

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 12, 2021 12:49:27 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
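
(This warning is expected for this job: the -DbeamTestPipelineOptions shown at
the start of the task configure only the BigQuery metrics sink, via
metricsBigQueryDataset and metricsBigQueryTable; with no InfluxDB
measurement/database configured, the publisher skips InfluxDB, as the message
says.)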

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 35.531 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/ndq7hxkuciriq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2413

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2413/display/redirect>

Changes:


------------------------------------------
[...truncated 337.29 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 9bf3ac3215333546daf9554f53be71e8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar>","--region=us-central1"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 11, 2021 6:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 11, 2021 6:45:02 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 11, 2021 6:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 11, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 6:45:08 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2038704487]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2021 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 6:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2021 6:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 11, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 11, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3499213969173461346.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-D9zJaYxvS8tJzo-57Dj1FSezsatwIbrwZVy9-19IHcg.jar
    Sep 11, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 11, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103998 bytes, hash 66cbbca901a3a4057e1535b113fa75c39e318520d2ddf2eeabffc9b0b05c10cf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Zsu8qQGjpAV-FTWxE_p1w54xhSDS3fLuq__JsLBcEM8.pb
    Sep 11, 2021 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2021 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 11, 2021 6:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 11, 2021 6:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 11, 2021 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-11_11_45_16-18074417737114682408?project=apache-beam-testing
    Sep 11, 2021 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-11_11_45_16-18074417737114682408
    Sep 11, 2021 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-11_11_45_16-18074417737114682408
    Sep 11, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-11T18:45:19.965Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:45:25.892Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 11, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:45:26.748Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:45:26.786Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:45:26.815Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:45:26.880Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:45:26.908Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:45:26.944Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 11, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:45:27.292Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 11, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:45:27.371Z: Starting 5 workers in us-central1-a...
    Sep 11, 2021 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:45:33.088Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2021 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:46:11.316Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 11, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:46:37.740Z: Workers have started successfully.
    Sep 11, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:46:37.771Z: Workers have started successfully.
    Sep 11, 2021 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:47:07.105Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 11, 2021 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:47:07.256Z: Cleaning up.
    Sep 11, 2021 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:47:07.333Z: Stopping worker pool...
    Sep 11, 2021 6:49:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:49:30.966Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 11, 2021 6:49:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T18:49:31.005Z: Worker pool stopped.
    Sep 11, 2021 6:49:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-11_11_45_16-18074417737114682408 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 45ef05c3-6710-44c2-9bcd-20e799557726 and timestamp: 2021-09-11T18:49:36.222000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.422

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2021 6:49:36 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 39.649 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 14s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/crqdafydtueqi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2412

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2412/display/redirect>

Changes:


------------------------------------------
[...truncated 337.65 KB...]
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 9bf3ac3215333546daf9554f53be71e8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar>","--region=us-central1"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 11, 2021 12:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 11, 2021 12:44:58 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 11, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 11, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
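
The two entries above show what push-down means at the IO layer: usedFields in the BeamPushDownIOSourceRel becomes a column projection and the "supported" part of BigQueryFilter becomes the row restriction just logged, so only matching rows and columns ever leave BigQuery storage (in the earlier BeamCalcRel plans, by contrast, the filter runs after a full read). Roughly the same read written directly against BigQueryIO might look like the sketch below; the table spec is a placeholder, since the IT resolves the real table behind beam.HACKER_NEWS through its table provider:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply(BigQueryIO.readTableRows()
            .from("some-project:some_dataset.hacker_news")  // placeholder table spec
            .withMethod(Method.DIRECT_READ)
            // Projection push-down: only these columns are read from storage.
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // Predicate push-down: the filter logged just above.
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }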
    Sep 11, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 11, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5935111599173117703.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PQfxmt2I_IHFUE2JE4FhArba1SRch9g5yH-KJxFzNJM.jar
    Sep 11, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 11, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash dac468a22a4b28596f2b93ef724021c417aa3f2995da5dd17d20d9f2455c1889> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2sRooipLKFlvK5PvckAhxBeqPymV2l3RfSDZ8kVcGIk.pb
    Sep 11, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 11, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 11, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 11, 2021 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-11_05_45_13-15054943979245899383?project=apache-beam-testing
    Sep 11, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-11_05_45_13-15054943979245899383
    Sep 11, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-11_05_45_13-15054943979245899383
    Sep 11, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-11T12:45:18.653Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:45:25.036Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:45:25.791Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:45:25.831Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:45:25.856Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:45:25.929Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:45:25.955Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:45:25.996Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:45:26.293Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 11, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:45:26.363Z: Starting 5 workers in us-central1-a...
    Sep 11, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:45:56.341Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-11T12:45:58.238Z: Autoscaling: Startup of the worker pool in zone us-central1-a reached 1 workers, but the goal was 5 workers. The service will retry. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2500.0 in region us-central1.
    Sep 11, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2021-09-11T12:46:10.736Z: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 5 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2500.0 in region us-central1.
    Sep 11, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2021-09-11T12:46:10.762Z: Workflow failed.
    Sep 11, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:46:10.827Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 11, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:46:10.898Z: Cleaning up.
    Sep 11, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:46:10.957Z: Stopping worker pool...
    Sep 11, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T12:46:27.073Z: Worker pool stopped.
    Sep 11, 2021 12:46:33 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-11_05_45_13-15054943979245899383 failed with status FAILED.
    Sep 11, 2021 12:46:33 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
    Sep 11, 2021 12:46:34 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
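
The run above failed for an environmental reason rather than a code one: with autoscaling disabled the service has to start all five requested workers at once, and the shared apache-beam-testing project was already at its us-central1 CPU quota. If a run needs to fit under quota, one option is a smaller fixed worker pool; a sketch using the same Dataflow options the job already passes on its command line (--numWorkers / --maxNumWorkers):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class QuotaFriendlyRunSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // With autoscalingAlgorithm=NONE, Dataflow must bring up every
        // requested worker before the job can run, so a smaller fixed pool
        // is less likely to trip the regional CPU quota.
        options.setNumWorkers(1);
        options.setMaxNumWorkers(1);
        Pipeline p = Pipeline.create(options);
        // ... attach the usual transforms here, then run:
        p.run().waitUntilFinish();
      }
    }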

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9fd9c040-7698-4ee3-86bd-124c76c3ebce and timestamp: 2021-09-11T12:46:33.743000000Z:
                     Metric:                    Value:
                   read_time                       0.0
                 fields_read                      -1.0
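
read_time 0.0 and fields_read -1.0 are consequences of the failed job rather than real measurements: no counters were ever committed, and -1.0 looks like the reader's placeholder for a metric it could not find (see the "Failed to get metric fields_read" entries above). For orientation, a counter like fields_read is typically produced by a monitoring DoFn along the lines of the hypothetical sketch below; the IT's actual RowMonitor may differ:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical stand-in for the IT's RowMonitor: counts every field that
    // passes through, under the namespace the MetricsReader queries above.
    class RowMonitorSketch extends DoFn<Row, Row> {
      private final Counter fieldsRead = Metrics.counter(
          "org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT",
          "fields_read");

      @ProcessElement
      public void processElement(@Element Row row, OutputReceiver<Row> out) {
        fieldsRead.inc(row.getFieldCount());  // one increment per delivered field
        out.output(row);
      }
    }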

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2021 12:46:34 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 1 mins 41.435 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 15s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/wchqsqip67p36

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2411

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2411/display/redirect>

Changes:


------------------------------------------
[...truncated 336.72 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 9bf3ac3215333546daf9554f53be71e8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 11, 2021 6:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 11, 2021 6:44:57 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 11, 2021 6:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 11, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 6:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 6:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 11, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 11, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test849136224923080782.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-KOXGO-JrQwQGgsriOgYXMQUQ9hZx3iQnv-N2ITUot1w.jar
    Sep 11, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 11, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 7edfb895ff05a07d75c6b91515f90393a6c53b1105d071a73e0b344dca5bca9d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ft-4lf8FoH11xrkVFfkDk6bFOxEF0HGnPgs0Tcpbyp0.pb
    Sep 11, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 11, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 11, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 11, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-10_23_45_10-11964116519564220217?project=apache-beam-testing
    Sep 11, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-10_23_45_10-11964116519564220217
    Sep 11, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-10_23_45_10-11964116519564220217
    Sep 11, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-11T06:45:14.184Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:20.369Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 11, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:21.102Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:21.134Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:21.171Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:21.245Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:21.273Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:21.308Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 11, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:21.675Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 11, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:21.758Z: Starting 5 workers in us-central1-a...
    Sep 11, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:52.584Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2021 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:45:58.250Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 11, 2021 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:46:23.794Z: Workers have started successfully.
    Sep 11, 2021 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:46:23.829Z: Workers have started successfully.
    Sep 11, 2021 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:46:51.657Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 11, 2021 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:46:51.807Z: Cleaning up.
    Sep 11, 2021 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:46:51.895Z: Stopping worker pool...
    Sep 11, 2021 6:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:49:14.040Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 11, 2021 6:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T06:49:14.082Z: Worker pool stopped.
    Sep 11, 2021 6:49:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-10_23_45_10-11964116519564220217 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c9e8fc3e-69d6-4945-890f-662096f22848 and timestamp: 2021-09-11T06:49:20.691000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                      6.96

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2021 6:49:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 28.468 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 0s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/usu6zowfkx4d4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2410

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2410/display/redirect?page=changes>

Changes:

[heejong] [BEAM-12805] Fix XLang CombinePerKey test by explicitly assigning the

[BenWhitehead] [BEAM-8376] Google Cloud Firestore Connector - Add handling for

[noreply] Decreasing peak memory usage for beam.TupleCombineFn (#15494)

[noreply] [BEAM-12802] Add support for prefetch through data layers down through

[noreply] [BEAM-11097] Add implementation of side input cache (#15483)


------------------------------------------
[...truncated 341.54 KB...]
> Task :sdks:java:extensions:sql:perf-tests:testClasses
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) started.
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 9bf3ac3215333546daf9554f53be71e8
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 11, 2021 12:46:22 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 11, 2021 12:46:23 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 11, 2021 12:46:23 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 11, 2021 12:46:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 12:46:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:46:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 12:46:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 12:46:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:46:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 12:46:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 11, 2021 12:46:29 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 11, 2021 12:46:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 11, 2021 12:46:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 11, 2021 12:46:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 11, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 11, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-YtvkOIsfQYfFzG89aVBO_8oNHoiij9SynwUJLdMkn5A.jar
    Sep 11, 2021 12:46:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5889603143190849404.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-4ObVKigR7akgH0SF93qu6ZbuTm3S_tjAlaGdLhLHDeI.jar
    Sep 11, 2021 12:46:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 11, 2021 12:46:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 11, 2021 12:46:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 72b471ff8a43d5c391d190cc2fbd851d2d132b298633fb71733ca0c0a65a4a06> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-crRx_4pD1cOR0ZDML72FHS0TKymGM_txczygwKZaSgY.pb
    Sep 11, 2021 12:46:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 11, 2021 12:46:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 11, 2021 12:46:37 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 11, 2021 12:46:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 11, 2021 12:46:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-10_17_46_37-7221468495849970836?project=apache-beam-testing
    Sep 11, 2021 12:46:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-10_17_46_37-7221468495849970836
    Sep 11, 2021 12:46:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-10_17_46_37-7221468495849970836
    Sep 11, 2021 12:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-11T00:46:41.590Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 11, 2021 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:46:52.987Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 11, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:46:53.820Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 11, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:46:53.855Z: Expanding GroupByKey operations into optimizable parts.
    Sep 11, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:46:53.895Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 11, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:46:53.950Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 11, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:46:53.986Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 11, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:46:54.015Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 11, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:46:54.325Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 11, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:46:54.392Z: Starting 5 workers in us-central1-c...
    Sep 11, 2021 12:47:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:47:14.690Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 11, 2021 12:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2021-09-11T00:47:29.879Z: Startup of the worker pool in zone us-central1-c failed to bring up any of the desired 5 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2500.0 in region us-central1.
    Sep 11, 2021 12:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2021-09-11T00:47:29.917Z: Workflow failed.
    Sep 11, 2021 12:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:47:29.992Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 11, 2021 12:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:47:30.080Z: Cleaning up.
    Sep 11, 2021 12:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:47:30.180Z: Stopping worker pool...
    Sep 11, 2021 12:47:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-11T00:47:48.900Z: Worker pool stopped.
    Sep 11, 2021 12:47:58 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-10_17_46_37-7221468495849970836 failed with status FAILED.
    Sep 11, 2021 12:47:58 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
    Sep 11, 2021 12:47:59 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3f7d9fed-bce0-4e58-b811-124514ebe7d9 and timestamp: 2021-09-11T00:47:58.903000000Z:
                     Metric:                    Value:
                   read_time                       0.0
                 fields_read                      -1.0
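
Both values above are sentinels rather than measurements: the worker pool never started because the 'CPUS' quota (limit 2500.0 in us-central1) was exhausted, the job finished FAILED before processing any data, and the fields_read counter was therefore never created, which is why MetricsReader logs the SEVERE lookups above and the table reports -1.0. To check which regional quota is saturated before re-running, something like the following works (a sketch; assumes the gcloud CLI is installed and authorized for the apache-beam-testing project):

    gcloud compute regions describe us-central1 --project=apache-beam-testing --format='yaml(quotas)'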

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 11, 2021 12:47:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 1 mins 40.845 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 37s
152 actionable tasks: 99 executed, 53 from cache

Publishing build scan...
https://gradle.com/s/kd4wmcmwdfh5w

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2409

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2409/display/redirect?page=changes>

Changes:

[noreply] Added type annotations to some combiners missing it. (#15414)

[noreply] [BEAM-12634] JmsIO auto scaling feature (#15464)

[noreply] [BEAM-12662] Get Flink version from cluster. (#15223)

[noreply] Port changes from Pub/Sub Lite to beam (#15418)


------------------------------------------
[...truncated 412.56 KB...]
Gradle Test Executor 12 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 38dcad72c650c7c44f736b7038162c5a
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 12'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 12'
Successfully started process 'Gradle Test Executor 12'
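
The long launch command above hands the pipeline configuration to the test JVM as a JSON array in the beamTestPipelineOptions system property, which Beam's TestPipeline parses into PipelineOptions. A minimal, self-contained sketch of that mechanism (the property value here is illustrative, and resolving the runner assumes beam-runners-direct-java is on the classpath):

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.testing.TestPipeline;

    public class BeamTestOptionsSketch {
      public static void main(String[] args) {
        // Same mechanism as -DbeamTestPipelineOptions=[...] in the launch
        // command above; setting the property in code is illustration only.
        System.setProperty(
            "beamTestPipelineOptions",
            "[\"--project=my-project\",\"--runner=DirectRunner\"]");

        PipelineOptions options = TestPipeline.testingPipelineOptions();
        // Prints "DirectRunner" if the direct runner is on the classpath.
        System.out.println(options.getRunner().getSimpleName());
      }
    }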

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 10, 2021 7:00:24 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 10, 2021 7:00:25 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 10, 2021 7:00:26 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 10, 2021 7:00:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 7:00:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 7:00:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 7:00:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 7:00:30 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 7:00:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 7:00:30 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
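
This failure is exactly the coder check the message describes: a ParDo that emits Beam Rows gives the resulting PCollection no way to infer a coder, so a schema (or an explicit RowCoder) must be attached before the PCollection is used. A generic, self-contained sketch of the remedy the error text suggests, not the integration test's actual code (the schema, names, and values are illustrative; running it assumes a runner such as the DirectRunner on the classpath):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Schema matching the four columns the test query projects.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(Create.of("seed"))
                .apply(
                    "MakeRows",
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(ProcessContext c) {
                            c.output(
                                Row.withSchema(schema)
                                    .addValues("someone", "story", "a title", 3)
                                    .build());
                          }
                        }))
                // Without this line, pipeline construction fails with the same
                // IllegalStateException as above. An explicit
                // .setCoder(RowCoder.of(schema)) would work equally well.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }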

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2021 7:00:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2021 7:00:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 7:00:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 7:00:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2021 7:00:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 7:00:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 7:00:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2021 7:00:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 7:00:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 7:00:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 7:00:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 7:00:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 7:00:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 7:00:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2021 7:00:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
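
For contrast with the two failing tests, this is the push-down path working as intended: because the table's read method is DIRECT_READ, the planner collapses the Calc-over-Source plan into a single BeamPushDownIOSourceRel, so both the projection (4 of the table's 14 columns, per usedFields above) and the filter logged here travel to the BigQuery Storage Read API instead of being evaluated inside the pipeline. In Beam SQL DDL the read method is selected through a table property; a sketch (the location and the trimmed-down column list are illustrative):

    CREATE EXTERNAL TABLE HACKER_NEWS (
      `by` VARCHAR,
      type VARCHAR,
      title VARCHAR,
      score INTEGER
    )
    TYPE bigquery
    LOCATION 'bigquery-public-data:hacker_news.full'
    TBLPROPERTIES '{"method": "DIRECT_READ"}'
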
    Sep 10, 2021 7:00:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2021 7:00:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2021 7:00:35 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-h_pN1a12Kq6Q7UgG0NwuMuy2KLX9eKEV-XpQQIHwHqY.jar
    Sep 10, 2021 7:00:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4898916112002885638.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-DewQy2pSPyoXzN2QRqIxBShnnP0VTsZ0oI-sq4hpDSg.jar
    Sep 10, 2021 7:00:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 10, 2021 7:00:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2021 7:00:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103997 bytes, hash 89bc71e6dda8b7e8097504b810cb791447025affc285ccf6110fef6d9d1663bb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ibxx5t2ot-gJdQS4EMt5FEcCWv_Chcz2EQ_vbZ0WY7s.pb
    Sep 10, 2021 7:00:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2021 7:00:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 10, 2021 7:00:39 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 10, 2021 7:00:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 10, 2021 7:00:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-10_12_00_39-2700056579921925071?project=apache-beam-testing
    Sep 10, 2021 7:00:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-10_12_00_39-2700056579921925071
    Sep 10, 2021 7:00:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-10_12_00_39-2700056579921925071
    Sep 10, 2021 7:00:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-10T19:00:42.835Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2021 7:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:00:50.439Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 10, 2021 7:00:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:00:51.364Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2021 7:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:00:51.416Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2021 7:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:00:51.453Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2021 7:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:00:51.512Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2021 7:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:00:51.555Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2021 7:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:00:51.576Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 10, 2021 7:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:00:52.054Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 10, 2021 7:00:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:00:52.145Z: Starting 5 workers in us-central1-c...
    Sep 10, 2021 7:01:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:01:14.071Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2021 7:01:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:01:26.365Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 10, 2021 7:01:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:01:26.406Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 10, 2021 7:01:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:01:36.685Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2021 7:02:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:01:59.773Z: Workers have started successfully.
    Sep 10, 2021 7:02:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:01:59.799Z: Workers have started successfully.
    Sep 10, 2021 7:02:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:02:31.652Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 10, 2021 7:02:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:02:31.792Z: Cleaning up.
    Sep 10, 2021 7:02:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:02:31.870Z: Stopping worker pool...
    Sep 10, 2021 7:04:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:04:50.340Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2021 7:04:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T19:04:50.400Z: Worker pool stopped.
    Sep 10, 2021 7:04:57 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-10_12_00_39-2700056579921925071 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 54e62b3e-2a76-4792-b7bd-5d2ec120f94e and timestamp: 2021-09-10T19:04:57.879000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                      7.65
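
Read against the plan above, these numbers are self-consistent: if fields_read increments once per projected field per row (an assumption about the test's monitoring counter, not something the log states), then 4375276 / 4 projected columns comes to 1093819 rows returned by the pushed-down filter in a read_time of 7.65 seconds, whereas the same query without push-down would have materialized all 14 columns of every row before filtering.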

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2021 7:04:58 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 12 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 37.548 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 20m 29s
152 actionable tasks: 151 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/tk2k32ltybuao

Stopped 11 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2408

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2408/display/redirect?page=changes>

Changes:

[noreply] Register MapCoder, some comments/cleanup. (#15471)

[noreply] [BEAM-12588] Multimap user state proto changes (#15473)


------------------------------------------
[...truncated 339.92 KB...]
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 21766bf14437d8b28e4d907b55c035fa
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 10, 2021 12:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 10, 2021 12:45:18 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 10, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 10, 2021 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2021 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2021 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2021 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2021 12:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 10, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-h_pN1a12Kq6Q7UgG0NwuMuy2KLX9eKEV-XpQQIHwHqY.jar
    Sep 10, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3445344613483535286.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-it6_M6HqRPgDuwLd701veraidtRKarLwKzLVaHJyHXo.jar
    Sep 10, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 10, 2021 12:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104003 bytes, hash 2172ed740860633236368d35a3344a3146d311ca4e57b47c5184859978358c55> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IXLtdAhgYzI2No01ozRKMUbTEcpOV7R8UYSFmXg1jFU.pb
    Sep 10, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 10, 2021 12:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 10, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-10_05_45_33-7298934043500527143?project=apache-beam-testing
    Sep 10, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-10_05_45_33-7298934043500527143
    Sep 10, 2021 12:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-10_05_45_33-7298934043500527143
    Sep 10, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-10T12:45:37.036Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:45:42.646Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 10, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:45:43.478Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:45:43.514Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:45:43.551Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:45:43.642Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:45:43.681Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:45:43.708Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 10, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:45:44.189Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 10, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:45:44.623Z: Starting 5 workers in us-central1-c...
    Sep 10, 2021 12:46:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:46:14.454Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:46:19.012Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 10, 2021 12:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:46:19.045Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 10, 2021 12:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:46:29.311Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:46:55.491Z: Workers have started successfully.
    Sep 10, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:46:55.512Z: Workers have started successfully.
    Sep 10, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:47:28.475Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 10, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:47:28.631Z: Cleaning up.
    Sep 10, 2021 12:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:47:28.719Z: Stopping worker pool...
    Sep 10, 2021 12:49:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:49:42.673Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2021 12:49:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T12:49:42.708Z: Worker pool stopped.
    Sep 10, 2021 12:49:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-10_05_45_33-7298934043500527143 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5c3e76f6-6a83-4a74-b1cc-1a4147eb5659 and timestamp: 2021-09-10T12:49:49.167000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.999

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2021 12:49:49 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 35.391 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 29s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/o657fyczukgiq

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2407

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2407/display/redirect?page=changes>

Changes:

[noreply] [BEAM-5097] Increment counter for "small words" in go SDK example


------------------------------------------
[...truncated 341.12 KB...]
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is a9f5d2013a02d9d2f955346998a3c933
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 10, 2021 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 10, 2021 6:45:30 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 10, 2021 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 10, 2021 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 6:45:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 6:45:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 6:45:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 6:45:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@213658433]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
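
The trace above is Beam's standard coder-inference failure for a PCollection of Row: a Row carries no compile-time schema, so no Coder can be inferred automatically. As a minimal, self-contained sketch of the remedy the message itself suggests (the schema and pipeline below are illustrative, not the integration test's actual code), calling PCollection.setRowSchema attaches the schema so Beam can build a row coder:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();  // DirectRunner, if on the classpath
        Schema schema = Schema.builder()
            .addStringField("type")
            .addInt32Field("score")
            .build();
        p.apply(Create.of("story,3", "job,5"))
            .apply(ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void process(@Element String line, OutputReceiver<Row> out) {
                String[] parts = line.split(",");
                out.output(Row.withSchema(schema)
                    .addValues(parts[0], Integer.parseInt(parts[1])).build());
              }
            }))
            // Without this call, coder inference fails with exactly the
            // IllegalStateException seen above.
            .setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }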

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2038704487]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 6:45:36 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2021 6:45:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 10, 2021 6:45:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2021 6:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2021 6:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ucAidJrqxK96xuFPkyoKLCyoB4CvF-dHe3bPkncTQpU.jar
    Sep 10, 2021 6:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3787861847279080959.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tCKlsxRDPI6hGVkUhmUCo3kAJe0jNM6i2QbV-S2rxzk.jar
    Sep 10, 2021 6:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 10, 2021 6:45:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2021 6:45:41 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104002 bytes, hash 36db5eb376d153c028f6969b0a525a168f9337732b4521a030613fd662cd1b8f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Nttes3bRU8Ao9pabClJaFo-TN3MrRSGgMGE_1mLNG48.pb
    Sep 10, 2021 6:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2021 6:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 10, 2021 6:45:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 10, 2021 6:45:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 10, 2021 6:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-09_23_45_44-6102199205758719663?project=apache-beam-testing
    Sep 10, 2021 6:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-09_23_45_44-6102199205758719663
    Sep 10, 2021 6:45:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-09_23_45_44-6102199205758719663
    Sep 10, 2021 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-10T06:45:48.778Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:46:57.003Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 10, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:46:57.783Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 10, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:46:57.827Z: Expanding GroupByKey operations into optimizable parts.
    Sep 10, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:46:57.869Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 10, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:46:57.960Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 10, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:46:57.991Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 10, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:46:58.023Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 10, 2021 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:46:58.501Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 10, 2021 6:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:46:58.602Z: Starting 5 workers in us-central1-c...
    Sep 10, 2021 6:47:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:47:08.308Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2021 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:47:33.516Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 10, 2021 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:47:33.546Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 10, 2021 6:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:47:43.831Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 10, 2021 6:48:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:48:07.331Z: Workers have started successfully.
    Sep 10, 2021 6:48:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:48:07.422Z: Workers have started successfully.
    Sep 10, 2021 6:48:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:48:34.915Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 10, 2021 6:48:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:48:35.453Z: Cleaning up.
    Sep 10, 2021 6:48:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:48:35.572Z: Stopping worker pool...
    Sep 10, 2021 6:51:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:51:01.913Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 10, 2021 6:51:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T06:51:01.986Z: Worker pool stopped.
    Sep 10, 2021 6:51:07 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-09_23_45_44-6102199205758719663 finished with status DONE.
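
The push-down run is the only one of the three tests that survives to job submission, because the planner replaced the BeamCalcRel with a BeamPushDownIOSourceRel: the projection (by, type, title, score) and the filter travel into the BigQuery Storage read itself. A rough sketch of the equivalent at the BigQueryIO level, assuming the public bigquery-public-data:hacker_news.full table stands in for the IT's beam.HACKER_NEWS:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // Only the four used fields are read, and the row restriction is
        // evaluated by the Storage Read API rather than by a BeamCalcRel.
        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")  // assumed table
                .withMethod(Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();  // requires GCP credentials and billing
      }
    }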

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3010cbc0-0ae3-4de1-9687-671292c97ac1 and timestamp: 2021-09-10T06:51:07.724000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.235

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2021 6:51:08 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
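
(Expected here: this job publishes metrics to BigQuery via --metricsBigQueryDataset/--metricsBigQueryTable, as seen in the test command line, and does not set the InfluxDB database/measurement options, so the InfluxDB publisher skips itself.)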

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
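
The pattern matches the runs above and below: the two plain read tests fail during pipeline construction on the Row coder issue, while the push-down variant builds and executes, so only it reports metrics.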
Finished generating test XML results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 5 mins 43.06 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 44s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/ckteho7wqctmy

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2406

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2406/display/redirect?page=changes>

Changes:

[ruwan.lambrichts] Clarify additional_bq_parameters argument

[noreply] Fix broken 'differences from pandas' link

[noreply] Added GroupBy row in Aggregation table.

[Etienne Chauchot] [BEAM-5172] Temporary ignore testSplit and testSizes tests waiting for a

[samuelw] [BEAM-12740] Remove matching to filter files when renaming gcs files in

[noreply] [BEAM-3304] Helper functions for triggers (#15430)

[esert] Bump a throttling counter on BigQueryRead retries due to


------------------------------------------
[...truncated 346.51 KB...]
Caching disabled for task ':sdks:java:extensions:sql:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:extensions:sql:testJar' is not up-to-date because:
  No history is available.
:sdks:java:extensions:sql:testJar (Thread[Daemon worker,5,main]) completed. Took 0.133 secs.
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Daemon worker,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:compileTestJava
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:compileTestJava'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is 8b54be572e3a891ec50ae23f0cd874b6
Task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':sdks:java:extensions:sql:perf-tests:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with JDK Java compiler API.
Created classpath snapshot for incremental compilation in 0.125 secs. 3324 duplicate classes found in classpath (see all with --debug).
Stored cache entry for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' with cache key 8b54be572e3a891ec50ae23f0cd874b6
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Daemon worker,5,main]) completed. Took 1.934 secs.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:testClasses
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 4 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 52158865a6a1134246d7f0cd77fcdbe5
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 4'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 4'
Successfully started process 'Gradle Test Executor 4'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 10, 2021 12:46:32 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 10, 2021 12:46:33 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 10, 2021 12:46:34 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 10, 2021 12:46:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:46:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 12:46:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:46:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 10, 2021 12:46:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2021 12:46:44 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2021 12:46:44 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ucAidJrqxK96xuFPkyoKLCyoB4CvF-dHe3bPkncTQpU.jar
    Sep 10, 2021 12:46:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3301184806270174235.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6EXH181S_JuJL-rLXn0F5Zi7ecJnf0DFyDuy8-YYA14.jar
    Sep 10, 2021 12:46:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 10, 2021 12:46:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2021 12:46:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104003 bytes, hash 286996d6b9ece45dc905ba2aa66565d4ecb08fdf6684b03495788253c30363d0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KGmW1rns5F3JBboqpmVl1Oywj99mhLA0lXiCU8MDY9A.pb
    Sep 10, 2021 12:46:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2021 12:46:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 10, 2021 12:46:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 10, 2021 12:46:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 10, 2021 12:46:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-09_17_46_48-13252814087671969139?project=apache-beam-testing
    Sep 10, 2021 12:46:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-09_17_46_48-13252814087671969139
    Sep 10, 2021 12:46:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-09_17_46_48-13252814087671969139
    Sep 10, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-10T00:46:52.073Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T00:46:59.051Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 10, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2021-09-10T00:47:00.326Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/19495 instances, 2/0 CPUs, 25/247716 disk GB, 0/2397 SSD disk GB, 1/272 instance groups, 1/275 managed instance groups, 1/501 instance templates, 1/724 in-use IP addresses.

    Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
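
(The summary reads required/available, so the blocking line is 2/0 CPUs: the job needed 2 CPUs for its single e2-standard-2 instance, and the project had no CPU quota free in us-central1 at submission time, so this run fails before any worker starts.)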
    Sep 10, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T00:47:00.409Z: Cleaning up.
    Sep 10, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T00:47:00.504Z: Worker pool stopped.
    Sep 10, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T00:47:01.762Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2021 12:47:06 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-09_17_46_48-13252814087671969139 failed with status FAILED.
    Sep 10, 2021 12:47:06 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
    Sep 10, 2021 12:47:06 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): db9d6a6b-3f3e-419b-a5af-88970102aeda and timestamp: 2021-09-10T00:47:06.309000000Z:
                     Metric:                    Value:
                   read_time                       0.0
                 fields_read                      -1.0
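
(The -1.0 for fields_read is consistent with the two failed metric lookups logged above: the job terminated in the FAILED state without committing counters, so the reader apparently falls back to -1 rather than a real count, and read_time stays 0.0.)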

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2021 12:47:06 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 38.058 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 44s
152 actionable tasks: 104 executed, 48 from cache

Publishing build scan...
https://gradle.com/s/twy4oukgudnbu

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2405

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2405/display/redirect?page=changes>

Changes:

[kawaigin] [BEAM-10708] Support streaming cache in beam_sql magic

[Luke Cwik] [BEAM-12769] Fix typo in test class name, CLass -> Class


------------------------------------------
[...truncated 337.77 KB...]
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is f2b763118ffad8fa1c25bbf167741317
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 09, 2021 6:44:55 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 09, 2021 6:44:56 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 09, 2021 6:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 09, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1317539897]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1789235794]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 09, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-E-KPx8odQzZ4kFtSdYrJ4EpS3UhAdfKiSlspRWHWbTM.jar
    Sep 09, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4032344636116975973.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Otr6YtIn8OvieoMlgDZEcUCuafRgZZmHLZEb8Q5U9Mw.jar
    Sep 09, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 09, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104003 bytes, hash 8f9d6106d50493f00fb68aae6b8182fc5013ed9896169621264f26680e5ae6f0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-j51hBtUEk_APtoqua4GC_FAT7ZiWFpYhJk8maA5a5vA.pb
    Sep 09, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 09, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 09, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 09, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-09_11_45_09-9517891140804248230?project=apache-beam-testing
    Sep 09, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-09_11_45_09-9517891140804248230
    Sep 09, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-09_11_45_09-9517891140804248230
    Sep 09, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-09T18:45:13.188Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:20.062Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 09, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:21.005Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:21.048Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:21.109Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:21.187Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:21.220Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:21.261Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 09, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:21.660Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 09, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:21.747Z: Starting 5 workers in us-central1-c...
    Sep 09, 2021 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:44.295Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:56.448Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 09, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:45:56.470Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 09, 2021 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:46:06.723Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 09, 2021 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:46:30.417Z: Workers have started successfully.
    Sep 09, 2021 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:46:30.444Z: Workers have started successfully.
    Sep 09, 2021 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:47:00.619Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 09, 2021 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:47:00.836Z: Cleaning up.
    Sep 09, 2021 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:47:00.962Z: Stopping worker pool...
    Sep 09, 2021 6:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:49:22.787Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 09, 2021 6:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T18:49:22.827Z: Worker pool stopped.
    Sep 09, 2021 6:49:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-09_11_45_09-9517891140804248230 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f0c2774c-b07d-468c-a595-fad1d2771713 and timestamp: 2021-09-09T18:49:37.316000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.656

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2021 6:49:37 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
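
The warning above is why the results printed by this run never reach InfluxDB: the publisher refuses to write without both a database and a measurement. A sketch of the settings object from the same testutils package, with placeholder values; the builder method names are an assumption based on Beam's test utilities, not read from this job's configuration:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        // All values are placeholders: hypothetical host and database; the
        // measurement mirrors the job's sql_bqio_read_java_batch table name.
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")
                .withDatabase("beam_test_metrics")
                .withMeasurement("sql_bqio_read_java_batch")
                .get();
      }
    }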

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 46.536 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 19s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/a4hzz53d6tcxk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2404

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2404/display/redirect>

Changes:


------------------------------------------
[...truncated 338.28 KB...]

> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is f2b763118ffad8fa1c25bbf167741317
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 09, 2021 12:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 09, 2021 12:44:57 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 09, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 09, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 12:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
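
Note the contrast with the push-down plan seen in the previous run: here the planner keeps a BeamCalcRel (the projection plus the story/job/score filter) on top of a plain BeamIOSourceRel, so filtering would happen inside the pipeline after a full-width read, whereas BeamPushDownIOSourceRel pushes usedFields and the supported filter into the BigQuery read itself. In this build only the push-down variant gets that far; both non-push-down tests fail at pipeline construction, as shown below.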


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1317539897]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1789235794]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
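
Both failures above are the Java SDK's generic missing-schema check: a PCollection of Beam Rows has no default Coder until a Schema is attached. A minimal sketch of the remedy the message itself names (PCollection.setRowSchema); the four-field schema is illustrative only, not the real HACKER_NEWS layout, whose Calc plan references fields $t0..$t13:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema covering only the four projected columns.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(
                        Row.withSchema(schema)
                            .addValues("someone", "story", "A title", 3L)
                            .build())
                    .withRowSchema(schema));

        // Equivalently, on a Row PCollection produced elsewhere:
        //   rows.setRowSchema(schema);
        // Without a schema (or an explicit .setCoder(...)), the first
        // downstream apply() fails with the IllegalStateException above.
        p.run().waitUntilFinish();
      }
    }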

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 09, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-E-KPx8odQzZ4kFtSdYrJ4EpS3UhAdfKiSlspRWHWbTM.jar
    Sep 09, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8725071349581070517.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zNV2c3gwees571su6TyCHDv9tDoPuabHg473JHoE_Ds.jar
    Sep 09, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 09, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104003 bytes, hash afa33cda3f3427844c94243b0ba3a45c77c2e7c3b3e47ff7bf2d687657b585d8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-r6M82j80J4RMlCQ7C6OkXHfC58Oz5H_3vy1odle1hdg.pb
    Sep 09, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 09, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 09, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 09, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-09_05_45_12-5706833399107965069?project=apache-beam-testing
    Sep 09, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-09_05_45_12-5706833399107965069
    Sep 09, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-09_05_45_12-5706833399107965069
    Sep 09, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-09T12:45:15.804Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:45:23.877Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 09, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:45:24.701Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:45:24.750Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:45:24.782Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:45:24.849Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:45:24.891Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:45:24.923Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 09, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:45:25.434Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 09, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:45:25.538Z: Starting 5 workers in us-central1-c...
    Sep 09, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:45:43.483Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2021 12:46:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:46:10.768Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 09, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:46:35.562Z: Workers have started successfully.
    Sep 09, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:46:35.643Z: Workers have started successfully.
    Sep 09, 2021 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:47:07.055Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 09, 2021 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:47:07.194Z: Cleaning up.
    Sep 09, 2021 12:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:47:07.266Z: Stopping worker pool...
    Sep 09, 2021 12:49:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:49:26.442Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 09, 2021 12:49:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T12:49:26.486Z: Worker pool stopped.
    Sep 09, 2021 12:49:33 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-09_05_45_12-5706833399107965069 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2040bd2f-fac5-432f-8d03-598d69026ea5 and timestamp: 2021-09-09T12:49:33.092000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.121

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2021 12:49:33 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 41.233 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 15s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/tj2hbwwkb4ssi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2403

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2403/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15480: [BEAM-12356] Make sure DatasetService is

[noreply] [BEAM-11981] Java Bigtable - Implement IO Request Count metrics (#15342)

[noreply] [BEAM-12834] Improve Go SDK cross-language documentation and API.


------------------------------------------
[...truncated 360.87 KB...]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 09, 2021 6:48:37 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 09, 2021 6:48:38 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 09, 2021 6:48:39 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
    Sep 09, 2021 6:48:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 6:48:41 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:48:42 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2021 6:48:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 6:48:43 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:48:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2021 6:48:43 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1317539897]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1789235794]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 6:48:44 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2021 6:48:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2021 6:48:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 09, 2021 6:48:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2021 6:48:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2021 6:48:48 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-E-KPx8odQzZ4kFtSdYrJ4EpS3UhAdfKiSlspRWHWbTM.jar
    Sep 09, 2021 6:48:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4102403725535759715.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YuEt9d4aY18rG-i3uAL0CA2cr-0Z5YBkJMJ3DvNaBa8.jar
    Sep 09, 2021 6:48:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.34.0-SNAPSHOT-xz-aSlhizJi8jigGo3amZ2g_MkglE3dBulicsi2UX5U.jar
    Sep 09, 2021 6:48:48 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT-tests-I-osQrG-lSEES6RD4ibuovIeVl-_U3nOEEsBLo1rYrA.jar
    Sep 09, 2021 6:48:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT-4bF8QpaxkWlkdPssX5OV0S9rIzFuRU0pojrN2fBwjjk.jar
    Sep 09, 2021 6:48:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/udf/build/libs/beam-sdks-java-extensions-sql-udf-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-udf-2.34.0-SNAPSHOT-MlQQAS1Je1cH4lXtIOKM3zL6o97S5F7sJQ3zUailw08.jar
    Sep 09, 2021 6:48:49 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 5 files newly uploaded in 0 seconds
    Sep 09, 2021 6:48:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2021 6:48:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104003 bytes, hash 571bfc393a8d0f6b3a33b7d0823acafed0530022247a92f96ee8625f31fccaee> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Vxv8OTqND2s6M7fQgjrK_tBTACIkepL5buhiXzH8yu4.pb
    Sep 09, 2021 6:48:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2021 6:48:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 09, 2021 6:48:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 09, 2021 6:48:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 09, 2021 6:48:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-08_23_48_52-5219237452619354559?project=apache-beam-testing
    Sep 09, 2021 6:48:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-08_23_48_52-5219237452619354559
    Sep 09, 2021 6:48:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-08_23_48_52-5219237452619354559
    Sep 09, 2021 6:48:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-09T06:48:55.523Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2021 6:49:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:01.268Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 09, 2021 6:49:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:01.914Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2021 6:49:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:01.974Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2021 6:49:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:02.007Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2021 6:49:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:02.082Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2021 6:49:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:02.110Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2021 6:49:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:02.131Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 09, 2021 6:49:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:02.500Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 09, 2021 6:49:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:02.580Z: Starting 5 workers in us-central1-a...
    Sep 09, 2021 6:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:12.854Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2021 6:49:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-09T06:49:39.926Z: Autoscaling: Startup of the worker pool in zone us-central1-a reached 2 workers, but the goal was 5 workers. The service will retry. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  Limit: 2000.0 in region us-central1.
    Sep 09, 2021 6:49:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:54.315Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 09, 2021 6:49:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:49:54.337Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 09, 2021 6:50:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:50:14.937Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 09, 2021 6:50:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:50:14.989Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 09, 2021 6:50:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:50:16.374Z: Workers have started successfully.
    Sep 09, 2021 6:50:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:50:16.412Z: Workers have started successfully.
    Sep 09, 2021 6:50:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:50:48.438Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 09, 2021 6:51:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:51:01.460Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 09, 2021 6:51:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:51:01.603Z: Cleaning up.
    Sep 09, 2021 6:51:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:51:01.681Z: Stopping worker pool...
    Sep 09, 2021 6:53:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:53:22.721Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 09, 2021 6:53:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T06:53:22.765Z: Worker pool stopped.
    Sep 09, 2021 6:53:29 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-08_23_48_52-5219237452619354559 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b4a6ad67-4b8d-4500-b63d-649a938374a1 and timestamp: 2021-09-09T06:53:29.738000000Z:
                     Metric:                    Value:
                   read_time                     22.55
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2021 6:53:30 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 6 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 58.44 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 10s
152 actionable tasks: 112 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/42gyfcvvqbb7y

Stopped 5 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2402

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2402/display/redirect>

Changes:


------------------------------------------
[...truncated 351.02 KB...]


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@138530218]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
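
The IllegalStateException above names its own fix: a PCollection of Beam Rows
gets no default coder, so it needs an explicit schema via setRowSchema. A
minimal sketch, with field names taken from the query in this log and field
types assumed for illustration:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Attaching a schema lets Beam derive a RowCoder for the collection.
      static PCollection<Row> withExplicitSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64) // type assumed
                .build();
        return rows.setRowSchema(schema);
      }
    }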

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2021 12:45:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 12:45:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 12:45:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 09, 2021 12:45:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 09, 2021 12:45:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 09, 2021 12:45:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 09, 2021 12:45:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 09, 2021 12:45:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
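
The two INFO lines above show the planner pushing both the projected fields
and the filter into the BigQuery source. For reference, a hand-written
BigQueryIO read expressing the same push-down; this is a sketch under assumed
names (the table is a placeholder), not the code path the test itself uses:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownSketch {
      // DIRECT_READ reads through the BigQuery Storage API; the selected fields
      // and row restriction mirror what the planner pushed into the source.
      static PCollection<TableRow> read(Pipeline p) {
        return p.apply(
            BigQueryIO.readTableRows()
                .from("project:dataset.hacker_news") // placeholder table
                .withMethod(TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));
      }
    }
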
    Sep 09, 2021 12:45:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 09, 2021 12:45:49 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 09, 2021 12:45:49 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-avG4QDSZYig5aaU1dJDVNXG4awWsT7XhuC5e-Ob-kdc.jar
    Sep 09, 2021 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3013290146853345367.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LVPnf-GhY1mCzwb8HXhG26nUAADpCXnjI97UN5oq5qA.jar
    Sep 09, 2021 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.12.7/1f45c6a397feeb46da75425619333d1cc6f90f78/mongo-java-driver-3.12.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.12.7-D_zgBJWNb9mzmuetJ37a0X9XtpcfSGsXYpxe6eE8Tao.jar
    Sep 09, 2021 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 248 files cached, 2 files newly uploaded in 0 seconds
    Sep 09, 2021 12:45:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 09, 2021 12:45:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104003 bytes, hash 2a665889f596ef4ef416cec14c0d8590b2ea4de37720055c36172e588687465d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KmZYifWW7070Fs7BTA2FkLLqTeN3IAVcNhcuWIaHRl0.pb
    Sep 09, 2021 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 09, 2021 12:45:53 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:625)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:264)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:383)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1900(InstantiatingGrpcChannelProvider.java:82)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:239)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:249)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:205)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:136)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
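
The SEVERE message spells out the expected cleanup for a gRPC channel: shut
down, then wait for termination. A minimal sketch of that sequence, assuming
`channel` is whichever ManagedChannel the client owns (the allocation site
above is inside the BigQuery client, not the test code):

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      // Orderly shutdown first; force shutdownNow() only if it times out.
      static void close(ManagedChannel channel) {
        channel.shutdown();
        try {
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        } catch (InterruptedException e) {
          channel.shutdownNow();
          Thread.currentThread().interrupt();
        }
      }
    }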

    Sep 09, 2021 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 09, 2021 12:45:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 09, 2021 12:45:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 09, 2021 12:45:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-08_17_45_54-9044668124243813912?project=apache-beam-testing
    Sep 09, 2021 12:45:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-08_17_45_54-9044668124243813912
    Sep 09, 2021 12:45:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-08_17_45_54-9044668124243813912
    Sep 09, 2021 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-09T00:45:57.759Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 09, 2021 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:04.041Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 09, 2021 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:05.024Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 09, 2021 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:05.055Z: Expanding GroupByKey operations into optimizable parts.
    Sep 09, 2021 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:05.082Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 09, 2021 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:05.133Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 09, 2021 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:05.160Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 09, 2021 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:05.184Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 09, 2021 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:05.548Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 09, 2021 12:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:05.638Z: Starting 5 workers in us-central1-a...
    Sep 09, 2021 12:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:13.056Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 09, 2021 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:46:49.273Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 09, 2021 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:47:14.824Z: Workers have started successfully.
    Sep 09, 2021 12:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:47:14.848Z: Workers have started successfully.
    Sep 09, 2021 12:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:47:43.436Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 09, 2021 12:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:47:43.578Z: Cleaning up.
    Sep 09, 2021 12:47:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:47:43.688Z: Stopping worker pool...
    Sep 09, 2021 12:50:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:50:06.553Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 09, 2021 12:50:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-09T00:50:06.606Z: Worker pool stopped.
    Sep 09, 2021 12:50:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-08_17_45_54-9044668124243813912 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3f64b9b3-c500-4886-ac1e-9cc5e6bcd348 and timestamp: 2021-09-09T00:50:13.488000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.131

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 09, 2021 12:50:14 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.043 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.048 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 40.737 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 52s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/kzmzobvcrsykc

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2401

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2401/display/redirect?page=changes>

Changes:

[suztomo] [BEAM-11205] Upgrading the Libraries BOM to v22

[randomstep] [BEAM-12708] Bump arrow-memory-netty

[Etienne Chauchot] [BEAM-12153] implement GroupByKey with CombinePerKey with Concatenate

[Etienne Chauchot] [BEAM-11023] Increase memory in SS Validates runner tests to avoid OOM

[vincent.marquez] [BEAM-9008] adds CassandraIO.readAll

[Etienne Chauchot] [BEAM-12727] extract Concatenate CombineFn to runner-core module to

[noreply] Add display data for JdbcIO.write (#15460)


------------------------------------------
[...truncated 368.90 KB...]


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@731892915]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2021 6:58:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2021 6:58:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2021 6:58:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2021 6:58:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2021 6:58:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2021 6:58:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2021 6:58:35 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 08, 2021 6:58:35 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 08, 2021 6:58:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 08, 2021 6:58:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 08, 2021 6:58:41 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-avG4QDSZYig5aaU1dJDVNXG4awWsT7XhuC5e-Ob-kdc.jar
    Sep 08, 2021 6:58:43 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8441147360364468252.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kwkXuvBrNImCSQycrQ6mtyejOx2Nxfo-AMJPN_XyHOk.jar
    Sep 08, 2021 6:58:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-bigtable-emulator/0.137.1/b1639aa134de1302e43d9e9c3843f6ff853c510f/google-cloud-bigtable-emulator-0.137.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigtable-emulator-0.137.1-V2SlLU4BjIGf5QoF7YL-A-QNjw49NdXrNM4QoX9MXSI.jar
    Sep 08, 2021 6:58:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 248 files cached, 2 files newly uploaded in 5 seconds
    Sep 08, 2021 6:58:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 08, 2021 6:58:47 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104000 bytes, hash a3f86994385fa585e082d4f97298ce1851b14fdef302fcb3efbc0cbff0caafb6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-o_hplDhfpYXggtT5cpjOGFGxT97zAvyz77wMv_DKr7Y.pb
    Sep 08, 2021 6:58:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 08, 2021 6:58:52 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:625)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:264)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:383)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1900(InstantiatingGrpcChannelProvider.java:82)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:239)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:249)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:205)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:136)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 08, 2021 6:58:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 08, 2021 6:58:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 08, 2021 6:58:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 08, 2021 6:58:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-08_11_58_53-12780752956074070003?project=apache-beam-testing
    Sep 08, 2021 6:58:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-08_11_58_53-12780752956074070003
    Sep 08, 2021 6:58:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-08_11_58_53-12780752956074070003
    Sep 08, 2021 6:58:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-08T18:58:56.367Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 08, 2021 6:59:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:03.058Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 08, 2021 6:59:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:03.807Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 08, 2021 6:59:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:03.839Z: Expanding GroupByKey operations into optimizable parts.
    Sep 08, 2021 6:59:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:03.866Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 08, 2021 6:59:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:03.923Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 08, 2021 6:59:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:03.951Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 08, 2021 6:59:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:03.974Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 08, 2021 6:59:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:04.312Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 08, 2021 6:59:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:04.378Z: Starting 5 workers in us-central1-c...
    Sep 08, 2021 6:59:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:27.905Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 08, 2021 6:59:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T18:59:54.430Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 08, 2021 7:00:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T19:00:20.630Z: Workers have started successfully.
    Sep 08, 2021 7:00:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T19:00:20.660Z: Workers have started successfully.
    Sep 08, 2021 7:00:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T19:00:53.302Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 08, 2021 7:00:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T19:00:53.423Z: Cleaning up.
    Sep 08, 2021 7:00:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T19:00:53.524Z: Stopping worker pool...
    Sep 08, 2021 7:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T19:03:09.283Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 08, 2021 7:03:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T19:03:09.321Z: Worker pool stopped.
    Sep 08, 2021 7:03:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-08_11_58_53-12780752956074070003 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a5d1b156-b9b7-4405-a7cb-07986c614330 and timestamp: 2021-09-08T19:03:16.708000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.564

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2021 7:03:19 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 6 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.05 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.068 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 5 mins 10.973 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 13m 50s
152 actionable tasks: 112 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/i2xjblbztihtu

Stopped 5 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2400

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2400/display/redirect>

Changes:


------------------------------------------
[...truncated 349.13 KB...]
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@286566052]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 08, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 08, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 08, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 08, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-kyLIsdCfiZiuGlhZEOOvAXu4J-2BxQlikUE0jy6y3CM.jar
    Sep 08, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2753128918288298644.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ZwZnAmef4mg8h099EhDWRGx0TSv7U6bV1H6cboSZGoI.jar
    Sep 08, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 08, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 08, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103369 bytes, hash b6fa60a944a0830af17173de67c8f60d1b043fd0a91d3e66eb48f5b48f3dc7d2> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tvpgqUSggwrxcXPeZ8j2DRsEP9CpHT5m60j1tI89x9I.pb
    Sep 08, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 08, 2021 12:45:25 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 08, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 08, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 08, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 08, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-08_05_45_25-3610861757899371753?project=apache-beam-testing
    Sep 08, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-08_05_45_25-3610861757899371753
    Sep 08, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-08_05_45_25-3610861757899371753
    Sep 08, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-08T12:45:29.368Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:45:36.258Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 08, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:45:37.177Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 08, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:45:37.209Z: Expanding GroupByKey operations into optimizable parts.
    Sep 08, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:45:37.232Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 08, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:45:37.294Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 08, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:45:37.326Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 08, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:45:37.343Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 08, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:45:37.667Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 08, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:45:37.741Z: Starting 5 workers in us-central1-c...
    Sep 08, 2021 12:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:46:08.261Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 08, 2021 12:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:46:22.755Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 08, 2021 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:46:49.150Z: Workers have started successfully.
    Sep 08, 2021 12:46:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:46:49.178Z: Workers have started successfully.
    Sep 08, 2021 12:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:47:20.678Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 08, 2021 12:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:47:20.813Z: Cleaning up.
    Sep 08, 2021 12:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:47:20.893Z: Stopping worker pool...
    Sep 08, 2021 12:49:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:49:43.577Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 08, 2021 12:49:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T12:49:43.639Z: Worker pool stopped.
    Sep 08, 2021 12:49:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-08_05_45_25-3610861757899371753 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8618766b-d55a-4471-b02a-2c84ef79a3dc and timestamp: 2021-09-08T12:49:51.180000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.474

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2021 12:49:52 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
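
    The publisher skips writing when its measurement/database settings are
    absent, so these runs only report metrics to stdout. The settings are
    normally passed as pipeline options; a sketch, assuming the option names
    used by Beam's test utilities (the values below are placeholders):

        --influxHost=http://localhost:8086 \
        --influxDatabase=beam_test_metrics \
        --influxMeasurement=sql_bqio_read_java_batch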

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 44.747 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 31s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/6rvgr2bxmnsu4

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2399

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2399/display/redirect>

Changes:


------------------------------------------
[...truncated 346.62 KB...]
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@70318993]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
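
    The exception text above also names the fix: give the Row-typed output a
    schema so a coder can be inferred. A minimal Java sketch, with
    illustrative field names (the test's actual schema is not shown in this
    log):

        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        // Attaching a schema lets Beam infer a SchemaCoder for the Rows,
        // which is what PCollection.getCoder() fails to find above.
        static PCollection<Row> withSchema(PCollection<Row> rows) {
          Schema schema =
              Schema.builder()
                  .addStringField("author")
                  .addStringField("type")
                  .addStringField("title")
                  .addInt64Field("score")
                  .build();
          return rows.setRowSchema(schema);
        }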

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 08, 2021 6:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
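
    For context: this IT drives the planner directly (BeamSqlRelUtils.toPCollection
    in the stack traces above), but the same query can be expressed with the
    public SqlTransform API. A minimal sketch, assuming HACKER_NEWS has been
    registered with a table provider (setup elided):

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.extensions.sql.SqlTransform;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        // Runs the same filter the planner pushes down to BigQuery above.
        static PCollection<Row> storiesAndJobs(Pipeline pipeline) {
          return pipeline.apply(
              SqlTransform.query(
                  "SELECT `by` AS author, type, title, score "
                      + "FROM HACKER_NEWS "
                      + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        }
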
    Sep 08, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 08, 2021 6:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 08, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-kyLIsdCfiZiuGlhZEOOvAXu4J-2BxQlikUE0jy6y3CM.jar
    Sep 08, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6782132080445812658.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-3QbjJJHqMxEriTCAXV-88kluvgS1-_SZ-WhihzXaVAg.jar
    Sep 08, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 08, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 08, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103368 bytes, hash 727c0de372d67e79965752641d5bdf47064c52ac4ff02dba3dfbaef8fb6eacda> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cnwN43LWfnmWV1JkHVvfRwZMUqxP8C26Pfuu-PturNo.pb
    Sep 08, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 08, 2021 6:45:13 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
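
    The warning's own advice is the fix: shut the channel down and wait for
    termination before the last reference is dropped. A minimal sketch of
    that pattern in plain gRPC (the five-second timeout is illustrative);
    in this log the channel is created inside BigQueryServicesImpl, so the
    cleanup belongs to the client wrapper that owns it:

        import io.grpc.ManagedChannel;
        import java.util.concurrent.TimeUnit;

        static void shutdownChannel(ManagedChannel channel) throws InterruptedException {
          channel.shutdown();                               // stop accepting new calls
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();                          // cancel in-flight calls
            channel.awaitTermination(5, TimeUnit.SECONDS);  // best-effort final wait
          }
        }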

    Sep 08, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 08, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 08, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 08, 2021 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-07_23_45_13-1903189651028351943?project=apache-beam-testing
    Sep 08, 2021 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-07_23_45_13-1903189651028351943
    Sep 08, 2021 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-07_23_45_13-1903189651028351943
    Sep 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-08T06:45:16.995Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 08, 2021 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:45:24.106Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 08, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:45:25.167Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 08, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:45:25.201Z: Expanding GroupByKey operations into optimizable parts.
    Sep 08, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:45:25.227Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 08, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:45:25.289Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 08, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:45:25.317Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 08, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:45:25.342Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 08, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:45:25.691Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 08, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:45:25.752Z: Starting 5 workers in us-central1-c...
    Sep 08, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:45:28.187Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 08, 2021 6:46:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:46:17.089Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 08, 2021 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:46:42.128Z: Workers have started successfully.
    Sep 08, 2021 6:46:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:46:42.154Z: Workers have started successfully.
    Sep 08, 2021 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:47:11.586Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 08, 2021 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:47:11.768Z: Cleaning up.
    Sep 08, 2021 6:47:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:47:11.838Z: Stopping worker pool...
    Sep 08, 2021 6:49:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:49:28.338Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 08, 2021 6:49:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T06:49:28.379Z: Worker pool stopped.
    Sep 08, 2021 6:49:37 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-07_23_45_13-1903189651028351943 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0a72177d-94e9-4644-980e-e86303c8f568 and timestamp: 2021-09-08T06:49:38.041000000Z:
                     Metric:                    Value:
                   read_time                     8.003
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2021 6:49:38 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.042 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 44.315 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 18s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/udz3vbqr3rp7c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2398

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2398/display/redirect?page=changes>

Changes:

[Valentyn Tymofieiev] Disable Kafka perf tests.

[Andrew Pilloud] [BEAM-12850] Calcite drops empty Calc now

[Andrew Pilloud] [BEAM-12853] VALUES produces a UNION, window can't be set afterwards

[Andrew Pilloud] [BEAM-12852] Revert BigTable changes, just cast to bigint

[Andrew Pilloud] [BEAM-12851] Map output table names

[Luke Cwik] [BEAM-12802] Define a prefetchable iterator and iterable and utility


------------------------------------------
[...truncated 355.76 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@286566052]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2021 12:47:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2021 12:47:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2021 12:47:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 08, 2021 12:47:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 08, 2021 12:47:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 08, 2021 12:47:49 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 08, 2021 12:47:50 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 08, 2021 12:47:50 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 08, 2021 12:47:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 08, 2021 12:47:55 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 08, 2021 12:47:55 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-kyLIsdCfiZiuGlhZEOOvAXu4J-2BxQlikUE0jy6y3CM.jar
    Sep 08, 2021 12:47:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests-bAwx5NkMSLPDxr3-g_qrdxYXMRzjFUW6mzgaBETsNRQ.jar
    Sep 08, 2021 12:47:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7694684679123257764.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-4kZRj0umaF3cn77hcmTm-b2-W2GGBGXXh5yslarxhuU.jar
    Sep 08, 2021 12:47:55 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-JWvpue78M5Rry8dtDE_bEvYq8OTAlYmyChunpeH4wGc.jar
    Sep 08, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 3 files newly uploaded in 1 seconds
    Sep 08, 2021 12:47:56 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 08, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103368 bytes, hash ea842bec9a6fbdfa2a693d1026a3bf9c1c0b75d144afe2fe74045b873344979b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-6oQr7JpvvfoqaT0QJqO_nBwLddFEr-L-dARbhzNEl5s.pb
    Sep 08, 2021 12:47:58 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 08, 2021 12:47:58 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 08, 2021 12:47:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 08, 2021 12:47:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 08, 2021 12:47:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 08, 2021 12:48:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-07_17_47_59-4372446111064550155?project=apache-beam-testing
    Sep 08, 2021 12:48:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-07_17_47_59-4372446111064550155
    Sep 08, 2021 12:48:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-07_17_47_59-4372446111064550155
    Sep 08, 2021 12:48:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-08T00:48:02.842Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 08, 2021 12:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:08.465Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 08, 2021 12:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:09.129Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 08, 2021 12:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:09.160Z: Expanding GroupByKey operations into optimizable parts.
    Sep 08, 2021 12:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:09.179Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 08, 2021 12:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:09.250Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 08, 2021 12:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:09.278Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 08, 2021 12:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:09.323Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 08, 2021 12:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:09.677Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 08, 2021 12:48:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:09.762Z: Starting 5 workers in us-central1-a...
    Sep 08, 2021 12:48:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:29.454Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 08, 2021 12:48:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:48:57.990Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 08, 2021 12:49:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:49:23.176Z: Workers have started successfully.
    Sep 08, 2021 12:49:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:49:23.202Z: Workers have started successfully.
    Sep 08, 2021 12:49:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:49:50.630Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 08, 2021 12:49:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:49:50.772Z: Cleaning up.
    Sep 08, 2021 12:49:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:49:50.867Z: Stopping worker pool...
    Sep 08, 2021 12:53:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:53:07.077Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 08, 2021 12:53:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-08T00:53:07.129Z: Worker pool stopped.
    Sep 08, 2021 12:53:12 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-07_17_47_59-4372446111064550155 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7573067b-04dd-4f90-9db7-0e5e466713b0 and timestamp: 2021-09-08T00:53:12.536000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.608

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 08, 2021 12:53:13 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
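
The InfluxDB warning above is benign for the test outcome: the run finished, but its metrics were dropped because the publisher found no database/measurement configured. A hedged sketch of that kind of guard, named after the publishWithCheck method in the log; how the real harness obtains these settings is not shown here and is assumed:

    public class InfluxGuardSketch {
      // Illustrative guard mirroring the warning above: skip publishing
      // when the required settings are absent instead of failing the build.
      static void publishWithCheck(String database, String measurement) {
        if (database == null || measurement == null) {
          System.err.println(
              "WARNING: Missing property -- measurement/database. Metrics won't be published.");
          return;
        }
        // ... write the collected metrics to InfluxDB here ...
      }
    }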

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.038 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 5 mins 35.27 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 51s
152 actionable tasks: 103 executed, 49 from cache

Publishing build scan...
https://gradle.com/s/6224axtaqz6a6

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2397

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2397/display/redirect?page=changes>

Changes:

[heejong] [BEAM-12838] Update artifact local path for DataflowRunner Java

[kwu] [BEAM-12828] Convert UseTestStream tests to use Long instead of Integer

[kwu] Apply SpotlessJava

[kwu] Apply SpotlessJava

[aydar.zaynutdinov] [BEAM-3385] Add requirements about `equals()` and `hashCode()` to

[aydar.zaynutdinov] [BEAM-3385] Changes regarding spotlessApply task

[noreply] Update runners/flink/job-server/flink_job_server.gradle

[heejong] separate into resolveArtifacts method

[heejong] add test

[aydar.zaynutdinov] [BEAM-3385] wrap up equals() and hashCode() methods into links

[heejong] update

[heejong] fix formatting


------------------------------------------
[...truncated 350.90 KB...]
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@286566052]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
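
This readUsingDefaultMethod failure is exactly what the exception text describes: a PCollection of Beam Rows reached pipeline finalization without a schema, so no coder could be inferred. A minimal, self-contained sketch of the fix the message suggests, using an illustrative two-field schema rather than the test's real one:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        Schema schema =
            Schema.builder().addStringField("author").addInt32Field("score").build();
        PCollection<Row> rows =
            p.apply(Create.of("someone:3"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String s, OutputReceiver<Row> out) {
                            String[] parts = s.split(":");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(parts[0], Integer.parseInt(parts[1]))
                                    .build());
                          }
                        }));
        // The call the error message asks for; without a schema (or an
        // explicit setCoder), coder inference fails as shown above.
        rows.setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }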

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2021 6:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 07, 2021 6:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
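
The three log entries above show the whole push-down story: the planner rewrote the LogicalProject/LogicalFilter into a BeamPushDownIOSourceRel, and buildIOReader handed the projection and filter to the BigQuery Storage read. The non-SQL equivalent can be sketched directly against BigQueryIO; the table name is illustrative, while the fields and restriction mirror the log:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // illustrative table
                .withMethod(TypedRead.Method.DIRECT_READ) // BigQuery Storage Read API
                // The projection and filter the planner pushed down above:
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }
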
    Sep 07, 2021 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 07, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 07, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 07, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test677474641367482438.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PCNOItkRmdHNGWcFxbPh6_6KIYojTFkaL1iXMhDmgEM.jar
    Sep 07, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 07, 2021 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 07, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <103368 bytes, hash 0452d91d9f41014011d638112db710717e22caf06b606dff5d4ed3dc84f5bd15> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BFLZHZ9BAUAR1jgRLbcQcX4iyvBrYG3_XU7T3IT1vRU.pb
    Sep 07, 2021 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 07, 2021 6:45:29 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
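
The SEVERE block above is gRPC's orphaned-channel detector: a ManagedChannel created for bigquerystorage.googleapis.com was garbage-collected without ever being shut down. It is a resource-hygiene warning, not the cause of the test failures. What the message asks for, as a standalone sketch:

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs over the channel ...
        } finally {
          // Orderly shutdown, then wait, as the warning instructs.
          channel.shutdown();
          if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-close anything still in flight
          }
        }
      }
    }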

    Sep 07, 2021 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 07, 2021 6:45:29 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 07, 2021 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 07, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-07_11_45_29-4344153523175646952?project=apache-beam-testing
    Sep 07, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-07_11_45_29-4344153523175646952
    Sep 07, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-07_11_45_29-4344153523175646952
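
Alongside the gcloud command above, a job can also be cancelled in code from the submitting JVM; a minimal sketch, assuming the PipelineResult returned by pipeline.run() is still in scope:

    import java.io.IOException;
    import org.apache.beam.sdk.PipelineResult;

    public class CancelSketch {
      /** Programmatic equivalent of the gcloud cancel command above. */
      static void cancel(PipelineResult result) {
        try {
          result.cancel(); // asks the Dataflow service to cancel the job
        } catch (IOException e) {
          // Fall back to the gcloud command printed in the log.
        }
      }
    }
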
    Sep 07, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-07T18:45:36.955Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
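
That WARNING is expected for this benchmark: the run pins a fixed pool of five workers by disabling autoscaling, and Dataflow then ignores maxNumWorkers. A hedged sketch of the option combination that would produce it:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE); // fixed pool
        options.setNumWorkers(5);    // the five workers started below
        options.setMaxNumWorkers(5); // ignored once autoscaling is NONE, hence the warning
      }
    }
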
    Sep 07, 2021 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:45:51.312Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 07, 2021 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:45:52.366Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 07, 2021 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:45:52.483Z: Expanding GroupByKey operations into optimizable parts.
    Sep 07, 2021 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:45:52.598Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 07, 2021 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:45:52.705Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 07, 2021 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:45:52.740Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 07, 2021 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:45:52.771Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 07, 2021 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:45:53.759Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 07, 2021 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:45:53.829Z: Starting 5 workers in us-central1-c...
    Sep 07, 2021 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:46:18.527Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 07, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:46:33.503Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 07, 2021 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:46:59.724Z: Workers have started successfully.
    Sep 07, 2021 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:46:59.751Z: Workers have started successfully.
    Sep 07, 2021 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:47:30.754Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 07, 2021 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:47:30.908Z: Cleaning up.
    Sep 07, 2021 6:47:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:47:31.293Z: Stopping worker pool...
    Sep 07, 2021 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:49:54.905Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 07, 2021 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T18:49:54.942Z: Worker pool stopped.
    Sep 07, 2021 6:50:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-07_11_45_29-4344153523175646952 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 99bbca5a-af68-4f92-b504-d6f7fe647007 and timestamp: 2021-09-07T18:50:03.195000000Z:
                     Metric:                    Value:
                   read_time                     7.883
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2021 6:50:04 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.07 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.079 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 52.557 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 45s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/qhbdxo4wujdvc

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2396

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2396/display/redirect>

Changes:


------------------------------------------
[...truncated 347.05 KB...]
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@286566052]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 07, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 07, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 07, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 07, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112460 bytes, hash 93a1e175b9b79d5c5e6ba2a376703e60fbed0c2390f0bd0722a1e609f1866607> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-k6Hhdbm3nVxea6KjdnA-YPvtDCOQ8L0HIqHmCfGGZgc.pb
    Sep 07, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 07, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 07, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5219788681739946429.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2nqE-IUzUHAXJkk_HoYMrpwFEMxoWZBym3zWkAA_pzc.jar
    Sep 07, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
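
The staging lines above show why resubmissions are cheap: of the 248 classpath files, 247 were already present at the staging location and only the freshly built test jar was uploaded. The locations involved are plain pipeline options; a brief sketch with illustrative bucket paths:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class StagingOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.create().as(DataflowPipelineOptions.class);
        // Illustrative paths; this log stages under
        // gs://temp-storage-for-perf-tests/loadtests/staging/
        options.setStagingLocation("gs://my-bucket/staging/");
        options.setTempLocation("gs://my-bucket/temp/");
      }
    }
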
    Sep 07, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 07, 2021 12:45:10 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 07, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 07, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 07, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 07, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-07_05_45_11-849641508808864443?project=apache-beam-testing
    Sep 07, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-07_05_45_11-849641508808864443
    Sep 07, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-07_05_45_11-849641508808864443
    Sep 07, 2021 12:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-07T12:45:14.723Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 07, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:45:19.108Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 07, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:45:19.926Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 07, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:45:20.002Z: Expanding GroupByKey operations into optimizable parts.
    Sep 07, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:45:20.030Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 07, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:45:20.089Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 07, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:45:20.137Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 07, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:45:20.160Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 07, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:45:20.442Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 07, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:45:20.533Z: Starting 5 workers in us-central1-a...
    Sep 07, 2021 12:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:45:49.174Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 07, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:46:04.225Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 07, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:46:30.316Z: Workers have started successfully.
    Sep 07, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:46:30.341Z: Workers have started successfully.
    Sep 07, 2021 12:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:46:58.071Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 07, 2021 12:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:46:58.270Z: Cleaning up.
    Sep 07, 2021 12:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:46:58.373Z: Stopping worker pool...
    Sep 07, 2021 12:49:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:49:30.825Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 07, 2021 12:49:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T12:49:30.867Z: Worker pool stopped.
    Sep 07, 2021 12:49:37 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-07_05_45_11-849641508808864443 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d6f18021-03aa-4931-a589-e995715e5182 and timestamp: 2021-09-07T12:49:37.908000000Z:
                     Metric:                    Value:
                   read_time                     8.327
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2021 12:49:38 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 45.23 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 18s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/kd4ebeilufxjk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2395

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2395/display/redirect>

Changes:


------------------------------------------
[...truncated 347.71 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@70318993]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 07, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 07, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 07, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 07, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112460 bytes, hash 4bbcdd6a36c080ec0c85fe822d3dfe5865b704a77973fcac82c2e73bebd75e3b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-S7zdajbAgOwMhf6CLT3-WGW3BKd5c_ysgsLnO-vXXjs.pb
    Sep 07, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 07, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 07, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2888780114063650351.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-zl-Vnbm9g0Ghhf58H5zjyDulKiFYFCz0r4eF-40e_gM.jar
    Sep 07, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 07, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 07, 2021 6:45:11 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
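
The SEVERE message above asks for an explicit shutdown of the orphaned gRPC channel. A minimal sketch of that discipline for a standalone channel (Beam's internally created BigQuery clients are not directly reachable from test code, so this only illustrates the pattern the warning names):

    import java.util.concurrent.TimeUnit;

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs over the channel ...
        } finally {
          channel.shutdown();                               // begin graceful shutdown
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();                          // cancel whatever is left
            channel.awaitTermination(5, TimeUnit.SECONDS);  // wait for termination
          }
        }
      }
    }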

    Sep 07, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 07, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 07, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 07, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-06_23_45_11-12810481183632081909?project=apache-beam-testing
    Sep 07, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-06_23_45_11-12810481183632081909
    Sep 07, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-06_23_45_11-12810481183632081909
    Sep 07, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-07T06:45:15.050Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 07, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:20.086Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 07, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:20.829Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 07, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:20.868Z: Expanding GroupByKey operations into optimizable parts.
    Sep 07, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:20.896Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 07, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:20.960Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 07, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:20.986Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 07, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:21.031Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 07, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:21.335Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 07, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:21.396Z: Starting 5 workers in us-central1-a...
    Sep 07, 2021 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:42.037Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
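
The two links in the message above point at the Monitoring v3 metricDescriptors API. A minimal sketch of listing (and optionally deleting) old custom descriptors with the google-cloud-monitoring client, assuming that client library is available; the project name is taken from the log, everything else is illustrative:

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    public class ListCustomDescriptorsSketch {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of("apache-beam-testing").toString())
                  // Monitoring filter for the custom.googleapis.com/* descriptors
                  // the log says have hit the 100-descriptor limit.
                  .setFilter("metric.type = starts_with(\"custom.googleapis.com/\")")
                  .build();
          for (MetricDescriptor d : client.listMetricDescriptors(request).iterateAll()) {
            System.out.println(d.getName()); // full resource name, usable with delete
            // To reclaim quota for an unused descriptor:
            // client.deleteMetricDescriptor(d.getName());
          }
        }
      }
    }
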
    Sep 07, 2021 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:59.908Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 07, 2021 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:45:59.939Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 07, 2021 6:46:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:46:10.320Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 07, 2021 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:46:32.048Z: Workers have started successfully.
    Sep 07, 2021 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:46:32.076Z: Workers have started successfully.
    Sep 07, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:46:59.907Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 07, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:47:00.036Z: Cleaning up.
    Sep 07, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:47:00.105Z: Stopping worker pool...
    Sep 07, 2021 6:49:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:49:25.974Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 07, 2021 6:49:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T06:49:26.009Z: Worker pool stopped.
    Sep 07, 2021 6:49:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-06_23_45_11-12810481183632081909 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): cdeb74cc-82cf-4b62-805d-6846bfe5148b and timestamp: 2021-09-07T06:49:32.801000000Z:
                     Metric:                    Value:
                   read_time                     9.354
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2021 6:49:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 40.237 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 13s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4ntintwdoevpg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2394

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2394/display/redirect>

Changes:


------------------------------------------
[...truncated 347.62 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@286566052]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
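
The exception message above names the fix: attach a schema to the Row PCollection. A minimal sketch of the setRowSchema route on a toy pipeline, with field names and types assumed from the SELECT list in the log (this is an illustration, not the test's actual repair):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // Field names/types are assumptions read off the query above.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();
        Row row = Row.withSchema(schema)
            .addValues("someone", "story", "a title", 3L).build();
        // setRowSchema attaches the schema (and hence a coder), which is what
        // the "Unable to return a default Coder" failure asks for.
        PCollection<Row> rows = p.apply(Create.of(row)).setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }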

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 07, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 07, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 07, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 07, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 07, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 07, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 07, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 07, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112461 bytes, hash a118467dd49ffcef4ecdea92d2796b3ae3476b4c3618ac6eb94942e75f7d8495> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-oRhGfdSf_O9OzeqS0nlrOuNHa0w2GKxuuUlC5199hJU.pb
    Sep 07, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 07, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6664303331593150101.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-8YoqFKtqp_alQIdoUI-5gFBc3xe7rODNpfG4KqeOblU.jar
    Sep 07, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 07, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 07, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 07, 2021 12:45:10 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 07, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-06_17_45_11-10672721731060003522?project=apache-beam-testing
    Sep 07, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-06_17_45_11-10672721731060003522
    Sep 07, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-06_17_45_11-10672721731060003522
    Sep 07, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-07T00:45:14.804Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 07, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:21.127Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 07, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:21.837Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 07, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:21.876Z: Expanding GroupByKey operations into optimizable parts.
    Sep 07, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:21.902Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 07, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:21.988Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 07, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:22.006Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 07, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:22.042Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 07, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:22.385Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:22.464Z: Starting 5 workers in us-central1-c...
    Sep 07, 2021 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:52.728Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 07, 2021 12:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:52.749Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 07, 2021 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:45:55.308Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 07, 2021 12:46:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:46:03.038Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 07, 2021 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:46:28.447Z: Workers have started successfully.
    Sep 07, 2021 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:46:28.486Z: Workers have started successfully.
    Sep 07, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:47:00.630Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 07, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:47:00.780Z: Cleaning up.
    Sep 07, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:47:00.860Z: Stopping worker pool...
    Sep 07, 2021 12:49:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:49:26.894Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 07, 2021 12:49:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-07T00:49:26.935Z: Worker pool stopped.
    Sep 07, 2021 12:49:35 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-06_17_45_11-10672721731060003522 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 449011c2-af89-4435-8473-3c2f4e5f462b and timestamp: 2021-09-07T00:49:35.071000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.047

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 07, 2021 12:49:35 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 42.078 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 15s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/sag6kpmy7pk6i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2393

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2393/display/redirect>

Changes:


------------------------------------------
[...truncated 346.90 KB...]
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@70318993]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 06, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 06, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 06, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 06, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112460 bytes, hash 711a43c9ae60ad1d7196fc713366ff4774f6028910ae161aa48fcf1167bf99a1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cRpDya5grR1xlvxxM2b_R3T2AokQrhYapI_PEWe_maE.pb
    Sep 06, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 06, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2113710927747881169.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-U7pVFP8_PvmAk8Chiic-7gdat8RRbpW7hHQpqwqcSiQ.jar
    Sep 06, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 06, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 06, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 06, 2021 6:45:11 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 06, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 06, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 06, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 06, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-06_11_45_12-233892037897958227?project=apache-beam-testing
    Sep 06, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-06_11_45_12-233892037897958227
    Sep 06, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-06_11_45_12-233892037897958227
    Sep 06, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-06T18:45:15.591Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 06, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:45:22.115Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 06, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:45:22.793Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 06, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:45:22.831Z: Expanding GroupByKey operations into optimizable parts.
    Sep 06, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:45:22.881Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 06, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:45:22.997Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 06, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:45:23.026Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 06, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:45:23.080Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 06, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:45:23.400Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 06, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:45:23.490Z: Starting 5 workers in us-central1-c...
    Sep 06, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:45:31.785Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 06, 2021 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:46:07.920Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 06, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:46:33.322Z: Workers have started successfully.
    Sep 06, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:46:33.345Z: Workers have started successfully.
    Sep 06, 2021 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:47:03.215Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 06, 2021 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:47:03.373Z: Cleaning up.
    Sep 06, 2021 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:47:03.445Z: Stopping worker pool...
    Sep 06, 2021 6:49:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:49:19.044Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 06, 2021 6:49:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T18:49:19.190Z: Worker pool stopped.
    Sep 06, 2021 6:49:28 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-06_11_45_12-233892037897958227 finished with status DONE.
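
The terminal-state line above is logged once the test's blocking call into the runner returns. A minimal sketch of that usual pattern, with an empty placeholder pipeline standing in for the real one:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WaitForTerminalStateSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // ... apply transforms here ...
        PipelineResult result = p.run();
        // Blocks until the job reaches a terminal state (DONE, FAILED, CANCELLED);
        // DataflowPipelineJob logs the status, as seen above, at that point.
        PipelineResult.State state = result.waitUntilFinish();
        System.out.println("Terminal state: " + state);
      }
    }
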

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 56182a06-0f10-4fde-83bd-92d0d27c6801 and timestamp: 2021-09-06T18:49:28.073000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.974

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2021 6:49:28 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
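
The metrics above are computed but then dropped because the measurement/database properties are unset. Purely for illustration (this is the plain influxdb-java client, not Beam's internal publisher), a sketch of writing the same two values; the endpoint, database, and measurement names are hypothetical:

    import java.util.concurrent.TimeUnit;
    import org.influxdb.InfluxDB;
    import org.influxdb.InfluxDBFactory;
    import org.influxdb.dto.Point;

    public class PublishMetricsSketch {
      public static void main(String[] args) {
        InfluxDB influxDB = InfluxDBFactory.connect("http://localhost:8086"); // hypothetical endpoint
        influxDB.setDatabase("beam_test_metrics"); // hypothetical database name
        influxDB.write(
            Point.measurement("sql_bqio_read_java_batch") // hypothetical measurement name
                .time(System.currentTimeMillis(), TimeUnit.MILLISECONDS)
                .addField("fields_read", 4375276.0)
                .addField("read_time", 7.974)
                .build());
        influxDB.close();
      }
    }
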

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 35.21 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 8s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/tmtihfpabalhg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2392

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2392/display/redirect>

Changes:


------------------------------------------
[...truncated 350.93 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
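
The exception text above already names the remedy: the Row PCollection built in readUsingDefaultMethod needs a schema via setRowSchema (or an explicit Coder via setCoder). A minimal sketch of the pattern, using a hypothetical schema for the four projected columns:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      // Hypothetical schema for the (author, type, title, score) projection.
      static final Schema SCHEMA =
          Schema.builder()
              .addNullableField("author", Schema.FieldType.STRING)
              .addNullableField("type", Schema.FieldType.STRING)
              .addNullableField("title", Schema.FieldType.STRING)
              .addNullableField("score", Schema.FieldType.INT64)
              .build();

      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        // With a schema attached, Beam can derive a RowCoder, so the
        // "Unable to return a default Coder" check above no longer trips.
        return rows.setRowSchema(SCHEMA);
      }
    }
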

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2021 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2021 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2021 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2021 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2021 12:45:14 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2021 12:45:14 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2021 12:45:15 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 06, 2021 12:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
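
The two plan lines above show what the planner pushed into the source: a four-field projection and the supported filter. A rough equivalent written directly against BigQueryIO's Storage API read looks like the sketch below; the table reference is hypothetical:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // hypothetical table
                .withMethod(Method.DIRECT_READ)
                // Only the used fields are requested from the Storage API...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ...and the supported predicate rides along as a row restriction.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }
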
    Sep 06, 2021 12:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 06, 2021 12:45:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 06, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112459 bytes, hash bda40eb7cf50cf89268623dcf58f7cb2d3c65a21c8c558b162a31126adb7059a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vaQOt89Qz4kmhiPc9Y98stPGWiHIxVixYqMRJq23BZo.pb
    Sep 06, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 06, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-flCHuvnqyUfqN7HJoR2_KDHap0ZeyuyWltYseI4uf_w.jar
    Sep 06, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 06, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5392601417418364659.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_Kg7mctPBChh6eDCtHCN3NraU4acMGY59opu0ABAx0M.jar
    Sep 06, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.m2/repository/org/apache/beam/beam-vendor-grpc-1_36_0/0.2/beam-vendor-grpc-1_36_0-0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-ARJ9IUTlyXpbZnCRXyAjusfKQ3IZ_alLBvpCeiWl3YQ.jar
    Sep 06, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 3 files newly uploaded in 1 seconds
    Sep 06, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 06, 2021 12:45:23 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
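
The SEVERE message above spells out the fix for the leaked channel (the leak sits inside the BigQuery client the SDK uses, not in test code). A minimal sketch of the orderly gRPC shutdown it asks for:

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
        try {
          // ... issue RPCs over the channel ...
        } finally {
          channel.shutdown(); // stop accepting new calls
          if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-cancel anything still in flight
          }
        }
      }
    }
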

    Sep 06, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 06, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 06, 2021 12:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 06, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-06_05_45_23-1807458484984207314?project=apache-beam-testing
    Sep 06, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-06_05_45_23-1807458484984207314
    Sep 06, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-06_05_45_23-1807458484984207314
    Sep 06, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-06T12:45:27.027Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 06, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:45:35.958Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:45:36.834Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:45:36.862Z: Expanding GroupByKey operations into optimizable parts.
    Sep 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:45:36.876Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:45:36.941Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:45:36.970Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:45:36.998Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:45:37.385Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 06, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:45:37.448Z: Starting 5 workers in us-central1-c...
    Sep 06, 2021 12:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:46:03.132Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 06, 2021 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:46:17.613Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 06, 2021 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:46:17.713Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 06, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:46:28.143Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 06, 2021 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:46:52.650Z: Workers have started successfully.
    Sep 06, 2021 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:46:52.681Z: Workers have started successfully.
    Sep 06, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:47:23.226Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 06, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:47:23.357Z: Cleaning up.
    Sep 06, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:47:23.427Z: Stopping worker pool...
    Sep 06, 2021 12:49:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:49:43.709Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 06, 2021 12:49:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T12:49:43.763Z: Worker pool stopped.
    Sep 06, 2021 12:49:49 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-06_05_45_23-1807458484984207314 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f1795ea7-0c86-46e9-bc81-eddab3de7b75 and timestamp: 2021-09-06T12:49:49.613000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.098

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2021 12:49:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 45.99 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 32s
152 actionable tasks: 99 executed, 53 from cache

Publishing build scan...
https://gradle.com/s/bzozrrtvrwltq

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2391

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2391/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #14927 from [BEAM-12400] MongoDBIO support for update


------------------------------------------
[...truncated 351.07 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@286566052]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 06, 2021 6:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 06, 2021 6:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 06, 2021 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 06, 2021 6:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112460 bytes, hash cccdde8303397054e696647b9b4abdf7f20fa5bb4811f6f546012c086472b75c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zM3egwM5cFTmlmR7m0q99_IPpbtIEfb1RgEsCGRyt1w.pb
    Sep 06, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 06, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 06, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3458559268242123816.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kueNokXYV5Y0IYBOcmqyfoPgFMmWGBezg9eOeTjXJYw.jar
    Sep 06, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT-h5Vm59YQQa-RcmHCCG09Iw6skDekxZqFrPsGtnvUHB4.jar
    Sep 06, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT-tests-Lg8Zg7us2s6VPtPB7f5WrJKFl5dhDd82wMfxmGkALkE.jar
    Sep 06, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 3 files newly uploaded in 0 seconds
    Sep 06, 2021 6:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 06, 2021 6:45:32 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 06, 2021 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 06, 2021 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 06, 2021 6:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 06, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-05_23_45_33-10006167550902779687?project=apache-beam-testing
    Sep 06, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-05_23_45_33-10006167550902779687
    Sep 06, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-05_23_45_33-10006167550902779687
    Sep 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-06T06:45:36.879Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 06, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:45:41.637Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 06, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:45:42.407Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 06, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:45:42.467Z: Expanding GroupByKey operations into optimizable parts.
    Sep 06, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:45:42.483Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 06, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:45:42.546Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 06, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:45:42.573Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 06, 2021 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:45:42.606Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 06, 2021 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:45:42.918Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 06, 2021 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:45:42.985Z: Starting 5 workers in us-central1-a...
    Sep 06, 2021 6:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:46:04.687Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 06, 2021 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:46:31.967Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 06, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:46:58.287Z: Workers have started successfully.
    Sep 06, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:46:58.314Z: Workers have started successfully.
    Sep 06, 2021 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:47:26.479Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 06, 2021 6:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:47:26.612Z: Cleaning up.
    Sep 06, 2021 6:47:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:47:26.686Z: Stopping worker pool...
    Sep 06, 2021 6:49:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:49:46.103Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 06, 2021 6:49:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T06:49:46.136Z: Worker pool stopped.
    Sep 06, 2021 6:49:51 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-05_23_45_33-10006167550902779687 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f4a1676b-cdc2-4474-89b1-c1c512a43618 and timestamp: 2021-09-06T06:49:51.495000000Z:
                     Metric:                    Value:
                   read_time                     9.398
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2021 6:49:52 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 36.84 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 32s
152 actionable tasks: 99 executed, 53 from cache

Publishing build scan...
https://gradle.com/s/5i7kkok6orbgk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2390

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2390/display/redirect>

Changes:


------------------------------------------
[...truncated 346.87 KB...]
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@70318993]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 06, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 06, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 06, 2021 12:45:03 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 06, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 06, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
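
For the push-down run, the planner's usedFields list and the filter logged just above map onto the BigQuery Storage Read API's field selection and row restriction. A minimal sketch of that mapping using the public Storage API types, with the exact column list and predicate from this log (the IT assembles this internally; the wrapper class here is illustrative):

    import com.google.cloud.bigquery.storage.v1.ReadSession;

    public class PushDownReadOptionsSketch {
      public static void main(String[] args) {
        ReadSession.TableReadOptions options =
            ReadSession.TableReadOptions.newBuilder()
                // Projected columns, per usedFields=[[by, type, title, score]]:
                .addSelectedFields("by")
                .addSelectedFields("type")
                .addSelectedFields("title")
                .addSelectedFields("score")
                // Predicate, per "Pushing down the following filter":
                .setRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2")
                .build();
        System.out.println(options);
      }
    }
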
    Sep 06, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 06, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 06, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112459 bytes, hash 68dae5563e39d962cf202919e180e150903c6c950bae892ca0a5dfdcdca8c9cb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aNrlVj452WLPICkZ4YDhUJA8bJULroksoKXf3Nyoycs.pb
    Sep 06, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 06, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 06, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2081981948379655649.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-wC8ji88uIqqXvXRR1z84kSHT2YTMTVoAP72AT_wBUT8.jar
    Sep 06, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 06, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 06, 2021 12:45:10 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

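The SEVERE channel leak above does not fail the test, but it appears in every run: the trace shows a BigQueryWriteClient being created during pipeline validation and then garbage-collected without being closed. A minimal sketch of the shutdown discipline the gRPC orphan detector asks for, against a placeholder endpoint rather than bigquerystorage.googleapis.com:443:

    import java.util.concurrent.TimeUnit;

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        // Placeholder endpoint for illustration only.
        ManagedChannel channel = ManagedChannelBuilder
            .forAddress("localhost", 8080)
            .usePlaintext()
            .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          channel.shutdown();  // stop accepting new calls
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();  // force-cancel anything still in flight
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }
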
    Sep 06, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 06, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 06, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 06, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-05_17_45_10-15242399150055407121?project=apache-beam-testing
    Sep 06, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-05_17_45_10-15242399150055407121
    Sep 06, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-05_17_45_10-15242399150055407121
    Sep 06, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-06T00:45:14.697Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 06, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:45:19.040Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 06, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:45:19.727Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 06, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:45:19.765Z: Expanding GroupByKey operations into optimizable parts.
    Sep 06, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:45:19.785Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 06, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:45:19.841Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 06, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:45:19.866Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 06, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:45:19.901Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 06, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:45:20.182Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 06, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:45:20.259Z: Starting 5 workers in us-central1-a...
    Sep 06, 2021 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:45:51.457Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 06, 2021 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:46:05.234Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 06, 2021 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:46:31.608Z: Workers have started successfully.
    Sep 06, 2021 12:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:46:31.656Z: Workers have started successfully.
    Sep 06, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:47:00.599Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 06, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:47:00.754Z: Cleaning up.
    Sep 06, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:47:00.823Z: Stopping worker pool...
    Sep 06, 2021 12:49:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:49:26.150Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 06, 2021 12:49:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-06T00:49:26.197Z: Worker pool stopped.
    Sep 06, 2021 12:49:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-05_17_45_10-15242399150055407121 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4ffeddae-4881-41f2-930f-132e1b8a487c and timestamp: 2021-09-06T00:49:32.837000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.433

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 06, 2021 12:49:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

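The InfluxDB warning above means the metrics printed under STANDARD_OUT (fields_read, read_time) were collected but not exported. A hedged sketch of how the publisher's settings are typically assembled in Beam's test utilities; the builder methods shown are assumed from the testutils API, and every value is a placeholder rather than this job's configuration:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        // Assumption: InfluxDBSettings exposes a builder with these setters; the
        // warning fires when measurement/database are left unset.
        InfluxDBSettings settings = InfluxDBSettings.builder()
            .withHost("http://localhost:8086")   // placeholder host
            .withDatabase("beam_test_metrics")   // placeholder database
            .withMeasurement("sql_bqio_read")    // placeholder measurement
            .get();
        System.out.println(settings);
      }
    }
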
Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 40.293 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 13s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/lldkllqzxtpes

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2389

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2389/display/redirect>

Changes:


------------------------------------------
[...truncated 346.72 KB...]
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@70318993]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 05, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 05, 2021 6:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 05, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 05, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112460 bytes, hash 331ee535d19f3c12e28d954487172e57358e077f111a4a9885e3d77025e8c324> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Mx7lNdGfPBLijZVEhxcuVzWOB38RGkqYhePXcCXowyQ.pb
    Sep 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7536243845373742245.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-wp0Vg1lfBJKx5Q0-fvd7Ubyd7BRjVWIKP5LRprCiovs.jar
    Sep 05, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 05, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 05, 2021 6:45:08 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 05, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 05, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 05, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 05, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-05_11_45_08-10811630702578601354?project=apache-beam-testing
    Sep 05, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-05_11_45_08-10811630702578601354
    Sep 05, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-05_11_45_08-10811630702578601354
    Sep 05, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-05T18:45:12.227Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 05, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:45:16.758Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 05, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:45:17.485Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 05, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:45:17.525Z: Expanding GroupByKey operations into optimizable parts.
    Sep 05, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:45:17.552Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 05, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:45:17.650Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 05, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:45:17.677Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 05, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:45:17.709Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 05, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:45:18.010Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 05, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:45:18.076Z: Starting 5 workers in us-central1-a...
    Sep 05, 2021 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:45:43.973Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 05, 2021 6:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:46:09.696Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 05, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:46:36.854Z: Workers have started successfully.
    Sep 05, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:46:36.874Z: Workers have started successfully.
    Sep 05, 2021 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:47:04.356Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 05, 2021 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:47:04.465Z: Cleaning up.
    Sep 05, 2021 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:47:04.521Z: Stopping worker pool...
    Sep 05, 2021 6:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:49:27.479Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 05, 2021 6:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T18:49:27.534Z: Worker pool stopped.
    Sep 05, 2021 6:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-05_11_45_08-10811630702578601354 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d514a0ce-bcf1-47f8-bca6-cb07511bfb21 and timestamp: 2021-09-05T18:49:32.843000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.541

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2021 6:49:33 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 42.809 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 15s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/xw5simwt52532

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2388

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2388/display/redirect>

Changes:


------------------------------------------
[...truncated 376.39 KB...]


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@70318993]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2021 1:34:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2021 1:34:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2021 1:34:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2021 1:34:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2021 1:34:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2021 1:34:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2021 1:34:49 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 05, 2021 1:34:49 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 05, 2021 1:34:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 05, 2021 1:34:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 05, 2021 1:34:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112459 bytes, hash 78c8b2edbe2b55892955aed83347386df3a05337af4d112934234ca29a48bc21> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-eMiy7b4rVYkpVa7YM0c4bfOgUzevTREpNCNMoppIvCE.pb
    Sep 05, 2021 1:34:55 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 05, 2021 1:34:55 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 05, 2021 1:34:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1279157143094325105.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-1nPVfEoA5Bk7WAyYcYKuM7ASyQO-BtC7nlMtexpvXgI.jar
    Sep 05, 2021 1:34:57 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.m2/repository/org/apache/beam/beam-vendor-grpc-1_36_0/0.2/beam-vendor-grpc-1_36_0-0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-nVxp3_1Xbd1ktCTMOtQxL1xQUUS_NwN-RSOhZ5xBwdg.jar
    Sep 05, 2021 1:34:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 2 files newly uploaded in 2 seconds
    Sep 05, 2021 1:34:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 05, 2021 1:34:58 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 05, 2021 1:34:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 05, 2021 1:34:58 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 05, 2021 1:34:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 05, 2021 1:34:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-05_06_34_58-11208989136571102888?project=apache-beam-testing
    Sep 05, 2021 1:34:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-05_06_34_58-11208989136571102888
    Sep 05, 2021 1:34:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-05_06_34_58-11208989136571102888
    Sep 05, 2021 1:35:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-05T13:35:02.402Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 05, 2021 1:35:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:06.768Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 05, 2021 1:35:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:07.301Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 05, 2021 1:35:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:07.330Z: Expanding GroupByKey operations into optimizable parts.
    Sep 05, 2021 1:35:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:07.362Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 05, 2021 1:35:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:07.438Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 05, 2021 1:35:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:07.473Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 05, 2021 1:35:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:07.507Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 05, 2021 1:35:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:07.806Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 05, 2021 1:35:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:07.885Z: Starting 5 workers in us-central1-a...
    Sep 05, 2021 1:35:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:17.092Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 05, 2021 1:35:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:35:54.082Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 05, 2021 1:36:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:36:22.443Z: Workers have started successfully.
    Sep 05, 2021 1:36:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:36:22.466Z: Workers have started successfully.
    Sep 05, 2021 1:36:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:36:46.374Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 05, 2021 1:36:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:36:46.516Z: Cleaning up.
    Sep 05, 2021 1:36:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:36:46.592Z: Stopping worker pool...
    Sep 05, 2021 1:39:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:39:07.734Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 05, 2021 1:39:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T13:39:07.783Z: Worker pool stopped.
    Sep 05, 2021 1:39:14 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-05_06_34_58-11208989136571102888 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2d97a785-7a81-4759-832a-a760c358f614 and timestamp: 2021-09-05T13:39:14.847000000Z:
                     Metric:                    Value:
                   read_time                     6.515
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2021 1:39:15 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
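
This warning means the run's metrics (read_time, fields_read) were collected but never written to InfluxDB, because the measurement/database settings were absent. A hedged sketch of supplying them through Beam's test-publishing settings; the builder method names are assumptions about the Beam test utilities, and the host/database/measurement values are placeholders, not the job's real configuration:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // Assumed API sketch (method names unverified): populate the two
    // properties the warning says are missing so metrics get published.
    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")    // placeholder host
            .withDatabase("beam_test_metrics")    // the missing "database"
            .withMeasurement("sql_bqio_read")     // the missing "measurement"
            .get();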

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 39.79 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 16s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/uzswpvyslljcq

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2387

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2387/display/redirect>

Changes:


------------------------------------------
[...truncated 357.68 KB...]
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@72019671]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
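
The IllegalStateException above lists its own remedies: give the Row PCollection a schema, or set a coder explicitly. A minimal sketch of the two APIs the message names, using an illustrative schema variable rather than the test's actual code:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Illustrative only: the two remedies the error message names.
    static PCollection<Row> withRowSchema(PCollection<Row> rows, Schema schema) {
      rows.setRowSchema(schema);              // preferred: schema-aware Row coder
      // rows.setCoder(RowCoder.of(schema));  // equivalent explicit coder
      return rows;
    }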

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2021 6:49:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2021 6:49:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2021 6:49:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2021 6:49:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2021 6:49:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2021 6:49:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2021 6:49:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 05, 2021 6:49:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 05, 2021 6:49:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 05, 2021 6:49:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 05, 2021 6:49:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112458 bytes, hash ba890ab37a3bb1648c134de1cc6338922f829634d8682dfd223beb65fdbc6080> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uokKs3o7sWSME03hzGM4ki-CljTYaC39IjvrZf28YIA.pb
    Sep 05, 2021 6:49:44 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 05, 2021 6:49:45 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 05, 2021 6:49:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1397531034086354658.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-DJkiL0IMl6536bjJglv_tQMKLx3lMnKSJZRy7qcwStU.jar
    Sep 05, 2021 6:49:50 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 5 seconds
    Sep 05, 2021 6:49:50 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 05, 2021 6:49:50 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
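
The SEVERE message above is gRPC reporting a leaked ManagedChannel (opened here inside BigQueryServicesImpl during pipeline validation) and asks for an explicit shutdown. A minimal sketch of the shutdown discipline the warning describes; the five-second timeout is an arbitrary choice for illustration:

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    // The shutdown sequence the gRPC orphan warning asks for:
    // shutdown(), then wait; escalate to shutdownNow() if RPCs linger.
    static void closeChannel(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();                                    // begin orderly shutdown
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {  // wait for in-flight RPCs
        channel.shutdownNow();                               // force-cancel stragglers
        channel.awaitTermination(5, TimeUnit.SECONDS);
      }
    }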

    Sep 05, 2021 6:49:52 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 05, 2021 6:49:52 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 05, 2021 6:49:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 05, 2021 6:49:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-04_23_49_53-10880493921920342660?project=apache-beam-testing
    Sep 05, 2021 6:49:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-04_23_49_53-10880493921920342660
    Sep 05, 2021 6:49:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-04_23_49_53-10880493921920342660
    Sep 05, 2021 6:49:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-05T06:49:56.711Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 05, 2021 6:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:02.935Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 05, 2021 6:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:03.511Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 05, 2021 6:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:03.555Z: Expanding GroupByKey operations into optimizable parts.
    Sep 05, 2021 6:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:03.587Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 05, 2021 6:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:03.659Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 05, 2021 6:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:03.692Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 05, 2021 6:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:03.726Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 05, 2021 6:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:04.142Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 05, 2021 6:50:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:04.222Z: Starting 5 workers in us-central1-c...
    Sep 05, 2021 6:50:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:32.292Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 05, 2021 6:50:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:50:49.604Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 05, 2021 6:51:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:51:14.650Z: Workers have started successfully.
    Sep 05, 2021 6:51:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:51:14.684Z: Workers have started successfully.
    Sep 05, 2021 6:51:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:51:44.035Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 05, 2021 6:51:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:51:44.198Z: Cleaning up.
    Sep 05, 2021 6:51:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:51:44.266Z: Stopping worker pool...
    Sep 05, 2021 6:54:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:54:05.921Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 05, 2021 6:54:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T06:54:05.970Z: Worker pool stopped.
    Sep 05, 2021 6:54:11 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-04_23_49_53-10880493921920342660 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 15539c25-47ac-431c-85cf-9f78ac6c2158 and timestamp: 2021-09-05T06:54:11.921000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.324

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2021 6:54:12 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.094 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.09 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 5 mins 48.465 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 42s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/psv7h3nv4lyak

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2386

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2386/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12769] Adds support for expanding a Java cross-language transform


------------------------------------------
[...truncated 352.86 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2134971427]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 05, 2021 12:45:26 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 05, 2021 12:45:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 05, 2021 12:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 05, 2021 12:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112463 bytes, hash f41cb218a1041a39cf3cf7afdb72df9a847c5a7d57d926c1dad92dd8bd1e3f95> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9ByyGKEEGjnPPPev23LfmoR8Wn1X2SbB2tkt2L0eP5U.pb
    Sep 05, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 05, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 05, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2707828256434146384.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-jf_V0bMI3mpHFu88TUVTELO7vySycrb3r_Bta5kQDEQ.jar
    Sep 05, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 05, 2021 12:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 05, 2021 12:45:32 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 05, 2021 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 05, 2021 12:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 05, 2021 12:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 05, 2021 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-04_17_45_33-10693535116842576941?project=apache-beam-testing
    Sep 05, 2021 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-04_17_45_33-10693535116842576941
    Sep 05, 2021 12:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-04_17_45_33-10693535116842576941
    Sep 05, 2021 12:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-05T00:45:36.848Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 05, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:43.482Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 05, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.128Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 05, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.164Z: Expanding GroupByKey operations into optimizable parts.
    Sep 05, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.200Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 05, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.273Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 05, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.307Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 05, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.348Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 05, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.812Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 05, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:45:44.900Z: Starting 5 workers in us-central1-c...
    Sep 05, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:07.834Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 05, 2021 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:19.665Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 05, 2021 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:19.693Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 05, 2021 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:29.998Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 05, 2021 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:53.614Z: Workers have started successfully.
    Sep 05, 2021 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:46:53.691Z: Workers have started successfully.
    Sep 05, 2021 12:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:47:24.873Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 05, 2021 12:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:47:25.026Z: Cleaning up.
    Sep 05, 2021 12:47:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:47:25.120Z: Stopping worker pool...
    Sep 05, 2021 12:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:49:42.646Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 05, 2021 12:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-05T00:49:42.694Z: Worker pool stopped.
    Sep 05, 2021 12:49:48 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-04_17_45_33-10693535116842576941 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f926465d-b3e6-40ed-9814-76040c167621 and timestamp: 2021-09-05T00:49:48.149000000Z:
                     Metric:                    Value:
                   read_time                     9.682
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 05, 2021 12:49:48 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 34.361 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 28s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/4u2tvivgrblcw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2385

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2385/display/redirect>

Changes:


------------------------------------------
[...truncated 347.67 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
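
The failing test above is missing exactly what the exception message asks for: a
schema (or coder) on the Row output before the pipeline finishes specifying it. A
minimal sketch of that fix, assuming a hypothetical RowMonitorFn and a schema
matching the query's four output fields -- illustrative, not the test's actual code:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    Schema schema =
        Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

    // 'input' stands in for the upstream PCollection<Row>; RowMonitorFn is a
    // placeholder DoFn name, not the SDK's.
    PCollection<Row> monitored =
        input
            .apply("RowMonitor", ParDo.of(new RowMonitorFn()))
            .setRowSchema(schema);  // equivalently: .setCoder(RowCoder.of(schema))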

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
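
For reference, the projection and filter in the plan above amount to the following
direct read at the BigQueryIO level. A hedged sketch: withSelectedFields and
withRowRestriction are the public TypedRead options that carry the push-down to the
Storage API, but the table reference here is illustrative, not the test's:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    PCollection<TableRow> rows =
        pipeline.apply(
            BigQueryIO.readTableRows()
                .from("apache-beam-testing:beam.HACKER_NEWS")  // illustrative table
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));
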
    Sep 04, 2021 6:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 04, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 04, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112461 bytes, hash ecf29a12b161196abd947e61668899f94fe6ec1fe7866013d31e744d8cd6c442> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7PKaErFhGWq9lH5hZoiZ-U_m7B_nhmAT0x50TYzWxEI.pb
    Sep 04, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 04, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3084273916801856233.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-TPyAvegaVjPfkTwlGinwhxXob9UjXOPbZjd03XwpZD4.jar
    Sep 04, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 04, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Sep 04, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Sep 04, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Sep 04, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Sep 04, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 243 files cached, 5 files newly uploaded in 0 seconds
    Sep 04, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 04, 2021 6:45:08 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
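
The remedy the warning spells out, in sketch form: call shutdown(), wait on
awaitTermination(), and fall back to shutdownNow(). The leaked channel here is
owned by the generated BigQuery client, so in practice the fix lands wherever that
client is created and closed; the standalone pattern, for code that owns a channel
directly, looks like this:

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    ManagedChannel channel =
        ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
    try {
        // ... use the channel ...
    } finally {
        channel.shutdown();  // begin orderly shutdown
        try {
            if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
                channel.shutdownNow();  // force-close anything still open
            }
        } catch (InterruptedException e) {
            channel.shutdownNow();
            Thread.currentThread().interrupt();
        }
    }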

    Sep 04, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 04, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 04, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 04, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-04_11_45_08-8099499084905887704?project=apache-beam-testing
    Sep 04, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-04_11_45_08-8099499084905887704
    Sep 04, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-04_11_45_08-8099499084905887704
    Sep 04, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-04T18:45:12.097Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 04, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:45:19.724Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 04, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:45:20.499Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 04, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:45:20.532Z: Expanding GroupByKey operations into optimizable parts.
    Sep 04, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:45:20.563Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 04, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:45:20.632Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 04, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:45:20.668Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 04, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:45:20.703Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 04, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:45:21.081Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 04, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:45:21.161Z: Starting 5 workers in us-central1-c...
    Sep 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:45:33.227Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 04, 2021 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:46:05.860Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 04, 2021 6:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:46:30.741Z: Workers have started successfully.
    Sep 04, 2021 6:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:46:30.771Z: Workers have started successfully.
    Sep 04, 2021 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:47:01.658Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 04, 2021 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:47:01.881Z: Cleaning up.
    Sep 04, 2021 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:47:01.992Z: Stopping worker pool...
    Sep 04, 2021 6:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:49:19.784Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 04, 2021 6:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T18:49:19.818Z: Worker pool stopped.
    Sep 04, 2021 6:49:25 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-04_11_45_08-8099499084905887704 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8f7b30f5-0af2-498b-86f6-a390c8855b77 and timestamp: 2021-09-04T18:49:25.088000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.303

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2021 6:49:25 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 35.083 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/5zh6rmapekmfa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2384

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2384/display/redirect>

Changes:


------------------------------------------
[...truncated 346.89 KB...]
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@286566052]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 04, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 04, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 04, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 04, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112459 bytes, hash 26c7ed55e31cbda68728b581403b027ebb42182e9a7e895eb7a0defb3088c8a6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-JsftVeMcvaaHKLWBQDsCfrtCGC6afolet6De-zCIyKY.pb
    Sep 04, 2021 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 04, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7050467490238436215.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-K8vlJNYVl3mXkMVuFyLisVTl64pJsfw8H3wy39zXD98.jar
    Sep 04, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 04, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 0 seconds
    Sep 04, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 04, 2021 12:45:13 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 04, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 04, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 04, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 04, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-04_05_45_14-4678713245512965619?project=apache-beam-testing
    Sep 04, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-04_05_45_14-4678713245512965619
    Sep 04, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-04_05_45_14-4678713245512965619
    Sep 04, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-04T12:45:17.341Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 04, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:45:22.576Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:45:23.309Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:45:23.349Z: Expanding GroupByKey operations into optimizable parts.
    Sep 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:45:23.375Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:45:23.439Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:45:23.468Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:45:23.495Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:45:23.840Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:45:23.908Z: Starting 5 workers in us-central1-c...
    Sep 04, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:45:36.339Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 04, 2021 12:46:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:46:09.856Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 04, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:46:35.394Z: Workers have started successfully.
    Sep 04, 2021 12:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:46:35.426Z: Workers have started successfully.
    Sep 04, 2021 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:47:03.940Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 04, 2021 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:47:04.091Z: Cleaning up.
    Sep 04, 2021 12:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:47:04.177Z: Stopping worker pool...
    Sep 04, 2021 12:49:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:49:22.360Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 04, 2021 12:49:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T12:49:22.399Z: Worker pool stopped.
    Sep 04, 2021 12:49:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-04_05_45_14-4678713245512965619 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 95d345fa-ad6d-487e-bb39-d90b3bd9b1af and timestamp: 2021-09-04T12:49:27.956000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.659

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2021 12:49:28 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 34.895 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/6p72oxi7zo6ga

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2383

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2383/display/redirect>

Changes:


------------------------------------------
[...truncated 347.27 KB...]
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@70318993]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 04, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 04, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 04, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 04, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112461 bytes, hash 534377993bcd3c3a171072dc27c937ef03b56ce5fe894be8687414c5d3dc38b8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-U0N3mTvNPDoXEHLcJ8k37wO1bOX-iUvoaHQUxdPcOLg.pb
    Sep 04, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 04, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5268561719917520832.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6oqD1whRXoQ6yzDZpHOvqd5p5vxwSFJo4IcBF59oY3c.jar
    Sep 04, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 04, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 0 seconds
    Sep 04, 2021 6:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 04, 2021 6:45:11 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
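
The SEVERE entry above is gRPC's orphan detector firing from its cleanup queue: a channel opened for the BigQuery write client during Pipeline.validate was garbage-collected without an explicit shutdown. Job submission continues regardless; the message is about resource hygiene. As a minimal sketch of the shutdown sequence the warning asks for (the target string and timeouts below are illustrative, not taken from Beam's code):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        // Hypothetical channel; the log's orphaned channel targets bigquerystorage.googleapis.com:443.
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
        try {
          // ... issue RPCs through stubs bound to this channel ...
        } finally {
          // What the SEVERE message asks for: initiate shutdown, then wait for termination.
          channel.shutdown();
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-close lingering calls if graceful shutdown times out
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }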

    Sep 04, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 04, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 04, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 04, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-03_23_45_12-2301445323111670089?project=apache-beam-testing
    Sep 04, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-03_23_45_12-2301445323111670089
    Sep 04, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-03_23_45_12-2301445323111670089
    Sep 04, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-04T06:45:15.483Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 04, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:45:20.939Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 04, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:45:21.722Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 04, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:45:21.769Z: Expanding GroupByKey operations into optimizable parts.
    Sep 04, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:45:21.816Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 04, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:45:21.882Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 04, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:45:21.918Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 04, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:45:21.951Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 04, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:45:22.279Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 04, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:45:22.345Z: Starting 5 workers in us-central1-a...
    Sep 04, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:45:32.545Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 04, 2021 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:46:08.178Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 04, 2021 6:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:46:34.026Z: Workers have started successfully.
    Sep 04, 2021 6:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:46:34.057Z: Workers have started successfully.
    Sep 04, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:47:01.774Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 04, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:47:01.903Z: Cleaning up.
    Sep 04, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:47:01.980Z: Stopping worker pool...
    Sep 04, 2021 6:49:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:49:21.464Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 04, 2021 6:49:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T06:49:21.511Z: Worker pool stopped.
    Sep 04, 2021 6:49:27 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-03_23_45_12-2301445323111670089 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c669b485-edcd-46b4-8afc-70612dc56ade and timestamp: 2021-09-04T06:49:27.517000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.149

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2021 6:49:28 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.043 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 34.236 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 9s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/a6q5lud2o6yau

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2382

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2382/display/redirect>

Changes:


------------------------------------------
[...truncated 347.82 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@286566052]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
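
This is the recurring failure across these builds: the output of ParDo(RowMonitor) is a PCollection of Beam Rows with no schema attached, so no coder can be inferred. The exception text names the remedy itself (PCollection.setRowSchema). A minimal, self-contained sketch of that route follows; the schema, values, and pipeline are illustrative, not the integration test's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Hypothetical schema mirroring the four columns the test's query selects.
      private static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(ParDo.of(
                    new DoFn<String, Row>() {
                      @ProcessElement
                      public void process(@Element String type, OutputReceiver<Row> out) {
                        out.output(Row.withSchema(SCHEMA).addValues("a", type, "t", 3L).build());
                      }
                    }))
                // Without this call, coder inference fails exactly as in the trace above;
                // setRowSchema attaches a schema so a Row coder can be derived.
                .setRowSchema(SCHEMA);
        p.run().waitUntilFinish();
      }
    }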

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 04, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 04, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 04, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 04, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 04, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
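
The INFO blocks above trace the life of the query: the SQL as parsed, Calcite's logical plan, and the Beam physical plan in which the projection and filter have been pushed into the BigQuery source. For orientation only, here is a minimal sketch of running the same query through SqlTransform against an in-memory table; this illustrates the query API, not the push-down itself (which requires the BigQuery table provider), and every name and row below is invented:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // Hypothetical in-memory stand-in for the beam.HACKER_NEWS BigQuery table.
        Schema schema =
            Schema.builder()
                .addStringField("by")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        PCollection<Row> hackerNews =
            p.apply(Create.of(
                    Row.withSchema(schema).addValues("alice", "story", "hello", 5L).build())
                .withRowSchema(schema));
        // The tag name becomes the table name visible to the SQL statement.
        PCollection<Row> result =
            PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), hackerNews)
                .apply(SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }
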
    Sep 04, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 04, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 04, 2021 12:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112459 bytes, hash b111799e38c3a6be8ecbf7bd01c73aea54afbb5883e872b652262ca5720e2184> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-sRF5njjDpr6Oy_e9Acc66lSvu1iD6HK2UiYspXIOIYQ.pb
    Sep 04, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 04, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 04, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7519257491340368708.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lU6yTasXEy1jKZc6V84VLRmEPBoMpZZhxZzo1R-6ALE.jar
    Sep 04, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.69/6cb063f1d527ff65bdbb9ea74888a5ffc3f92197/fastjson-1.2.69.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.69-KniRdEoDeOAJmpuB3b7U0uGn6GMvb4b4_8-jg-UeScg.jar
    Sep 04, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 2 files newly uploaded in 0 seconds
    Sep 04, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 04, 2021 12:45:11 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 04, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 04, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 04, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 04, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-03_17_45_12-18161074009312431671?project=apache-beam-testing
    Sep 04, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-03_17_45_12-18161074009312431671
    Sep 04, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-03_17_45_12-18161074009312431671
    Sep 04, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-04T00:45:15.504Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 04, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:20.637Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 04, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:21.298Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 04, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:21.341Z: Expanding GroupByKey operations into optimizable parts.
    Sep 04, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:21.375Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 04, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:21.442Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 04, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:21.470Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 04, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:21.504Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 04, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:21.872Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 04, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:21.945Z: Starting 5 workers in us-central1-a...
    Sep 04, 2021 12:45:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:46.022Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 04, 2021 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:59.192Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 04, 2021 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:45:59.217Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 04, 2021 12:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:46:09.466Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 04, 2021 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:46:33.922Z: Workers have started successfully.
    Sep 04, 2021 12:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:46:33.958Z: Workers have started successfully.
    Sep 04, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:47:01.658Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 04, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:47:01.795Z: Cleaning up.
    Sep 04, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:47:01.862Z: Stopping worker pool...
    Sep 04, 2021 12:49:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:49:29.082Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 04, 2021 12:49:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-04T00:49:29.134Z: Worker pool stopped.
    Sep 04, 2021 12:49:35 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-03_17_45_12-18161074009312431671 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): afc124b2-d3df-4bf3-b447-60d39de3bdf2 and timestamp: 2021-09-04T00:49:35.695000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.903

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 04, 2021 12:49:36 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 42.676 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 17s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/2f4ix2irnoptu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2381

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2381/display/redirect>

Changes:


------------------------------------------
[...truncated 346.84 KB...]
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@70318993]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2021 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2021 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2021 6:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2021 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2021 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2021 6:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2021 6:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 03, 2021 6:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 03, 2021 6:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 03, 2021 6:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 03, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112460 bytes, hash 9925b8aebfc1d4920ccbea8ef72f91fbb7e0d1b9812bd48b266d817398fa63b8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-mSW4rr_B1JIMy-qO9y-R-7fg0bmBK9SLJm2Bc5j6Y7g.pb
    Sep 03, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 03, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test101276196055653085.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YRglKCfmXPqyJHwBe-8liFPzC0Yx-0C9HsvWNnaIa54.jar
    Sep 03, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 03, 2021 6:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 03, 2021 6:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 03, 2021 6:45:31 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 03, 2021 6:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 03, 2021 6:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 03, 2021 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 03, 2021 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-03_11_45_32-10410911850525840854?project=apache-beam-testing
    Sep 03, 2021 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-03_11_45_32-10410911850525840854
    Sep 03, 2021 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-03_11_45_32-10410911850525840854
    Sep 03, 2021 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-03T18:45:35.566Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 03, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:45:43.672Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 03, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:45:44.453Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 03, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:45:44.498Z: Expanding GroupByKey operations into optimizable parts.
    Sep 03, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:45:44.528Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 03, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:45:44.604Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 03, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:45:44.629Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 03, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:45:44.664Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 03, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:45:45.147Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 03, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:45:45.233Z: Starting 5 workers in us-central1-c...
    Sep 03, 2021 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:46:15.004Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 03, 2021 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:46:30.091Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 03, 2021 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:46:55.896Z: Workers have started successfully.
    Sep 03, 2021 6:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:46:55.924Z: Workers have started successfully.
    Sep 03, 2021 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:47:27.166Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 03, 2021 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:47:27.309Z: Cleaning up.
    Sep 03, 2021 6:47:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:47:27.401Z: Stopping worker pool...
    Sep 03, 2021 6:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:49:47.444Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 03, 2021 6:49:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T18:49:47.496Z: Worker pool stopped.
    Sep 03, 2021 6:49:52 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-03_11_45_32-10410911850525840854 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 48eea044-a623-4a09-89dc-c7f1e5afca36 and timestamp: 2021-09-03T18:49:53Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     6.644

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2021 6:49:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 42.734 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 20s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/eb3crdwfzk5ty

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2380

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2380/display/redirect?page=changes>

Changes:

[aydar.zaynutdinov] [BEAM-11873] Add support for writes with returning values in JdbcIO


------------------------------------------
[...truncated 361.94 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1121334267]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
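
Note: the recurring failure above is Beam refusing to infer a Coder for a PCollection<Row>; the fix the message itself points at is attaching a Schema via PCollection.setRowSchema. A minimal, self-contained sketch of that pattern (the schema and DoFn below are illustrative, not the test's actual RowMonitor):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Schema for the rows emitted below; the field set is illustrative.
        final Schema schema =
            Schema.builder().addStringField("type").addInt64Field("score").build();

        PCollection<Row> rows =
            p.apply(Create.of("story,3", "job,5"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String line, OutputReceiver<Row> out) {
                            String[] parts = line.split(",");
                            out.output(
                                Row.withSchema(schema)
                                    .addValues(parts[0], Long.parseLong(parts[1]))
                                    .build());
                          }
                        }))
                // Without this call, coder inference fails exactly as in the
                // stack trace above; setRowSchema attaches a RowCoder.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }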

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2021 12:50:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2021 12:50:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2021 12:50:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2021 12:50:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2021 12:50:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2021 12:50:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2021 12:50:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 03, 2021 12:50:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
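
Note: the plan above shows both the projection (usedFields) and the filter being pushed into the BigQuery read. For reference, a query of this shape can be run with SqlTransform against any schema'd PCollection; a minimal sketch, assuming `rows` is a PCollection<Row> whose schema carries the by/type/title/score fields (push-down itself only applies to table providers that support it, such as the BigQuery table used here):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // `rows` is assumed to carry a schema with by/type/title/score fields.
    PCollection<Row> filtered =
        rows.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
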
    Sep 03, 2021 12:50:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 03, 2021 12:50:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 03, 2021 12:50:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112460 bytes, hash 2e354590ef801a216f260851173a8ca974f7d28030aa974a58fb40f5ed76feaa> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LjVFkO-AGiFvJghRFzqMqXT30oAwqpdKWPtA9e12_qo.pb
    Sep 03, 2021 12:50:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 03, 2021 12:50:38 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 03, 2021 12:50:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3018342601264430199.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-EQWikt4wAkgOHsKMlo92X1AI8bsSGjLtuzLANCPfRXk.jar
    Sep 03, 2021 12:50:44 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 6 seconds
    Sep 03, 2021 12:50:45 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 03, 2021 12:50:45 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
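
Note: the SEVERE message above is gRPC's orphaned-channel detection firing during pipeline validation; a channel is created internally and never closed. For reference, the shutdown discipline the warning asks for looks like this (a sketch for code that owns its own channel, not a patch to the Beam internals in the trace):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public final class ChannelShutdownExample {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          // Orderly shutdown, then wait; escalate to shutdownNow() if the
          // channel has not terminated within the grace period.
          channel.shutdown();
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }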

    Sep 03, 2021 12:50:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 03, 2021 12:50:46 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 03, 2021 12:50:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 03, 2021 12:50:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-03_05_50_48-7456646592635181481?project=apache-beam-testing
    Sep 03, 2021 12:50:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-03_05_50_48-7456646592635181481
    Sep 03, 2021 12:50:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-03_05_50_48-7456646592635181481
    Sep 03, 2021 12:50:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-03T12:50:51.538Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 03, 2021 12:50:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:50:59.206Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 03, 2021 12:51:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:00.073Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 03, 2021 12:51:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:00.101Z: Expanding GroupByKey operations into optimizable parts.
    Sep 03, 2021 12:51:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:00.132Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 03, 2021 12:51:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:00.205Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 03, 2021 12:51:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:00.244Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 03, 2021 12:51:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:00.275Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 03, 2021 12:51:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:00.660Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 03, 2021 12:51:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:00.739Z: Starting 5 workers in us-central1-c...
    Sep 03, 2021 12:51:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:25.158Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 03, 2021 12:51:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:36.762Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Sep 03, 2021 12:51:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:36.791Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Sep 03, 2021 12:51:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:51:47.149Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 03, 2021 12:52:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:52:10.640Z: Workers have started successfully.
    Sep 03, 2021 12:52:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:52:10.668Z: Workers have started successfully.
    Sep 03, 2021 12:52:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:52:40.301Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 03, 2021 12:52:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:52:40.452Z: Cleaning up.
    Sep 03, 2021 12:52:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:52:40.540Z: Stopping worker pool...
    Sep 03, 2021 12:54:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:54:57.756Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 03, 2021 12:54:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T12:54:57.833Z: Worker pool stopped.
    Sep 03, 2021 12:55:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-03_05_50_48-7456646592635181481 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c1a15772-5fc4-4c23-9853-48611afba533 and timestamp: 2021-09-03T12:55:04.222000000Z:
                     Metric:                    Value:
                   read_time                      8.35
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2021 12:55:06 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.248 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.23 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 5 mins 46.996 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 22s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/zrweouhbad3ic

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2379

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2379/display/redirect?page=changes>

Changes:

[Andrew Pilloud] [BEAM-12680] Calcite SqlTransform no longer experimental


------------------------------------------
[...truncated 351.35 KB...]


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@286566052]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2021 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2021 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2021 6:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2021 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2021 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2021 6:45:24 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 03, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 03, 2021 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 03, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 03, 2021 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112460 bytes, hash 408cd3d127337593eb7a3f0cf8ca306acfc7762cd8c387b011327a5cfa310e31> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QIzT0SczdZPrej8M-Mowas_HdizYw4ewETJ6XPoxDjE.pb
    Sep 03, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 03, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-NzuIuC0tWIGa8mV7asF_NOtTs4zf7xUyMI972IiYOJo.jar
    Sep 03, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9092202468420968078.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BdLfzE8ukUtVSY0FvVhGjkcHODSRk6YRre6SD9JtTPE.jar
    Sep 03, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 03, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 2 files newly uploaded in 0 seconds
    Sep 03, 2021 6:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 03, 2021 6:45:31 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 03, 2021 6:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 03, 2021 6:45:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 03, 2021 6:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 03, 2021 6:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-02_23_45_31-6395771659507821382?project=apache-beam-testing
    Sep 03, 2021 6:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-02_23_45_31-6395771659507821382
    Sep 03, 2021 6:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-02_23_45_31-6395771659507821382
    Sep 03, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-03T06:45:35.171Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 03, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:45:40.138Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 03, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:45:40.952Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 03, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:45:40.999Z: Expanding GroupByKey operations into optimizable parts.
    Sep 03, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:45:41.027Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 03, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:45:41.092Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 03, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:45:41.118Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 03, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:45:41.150Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 03, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:45:41.478Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 03, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:45:41.545Z: Starting 5 workers in us-central1-a...
    Sep 03, 2021 6:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:46:12.507Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 03, 2021 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:46:25.361Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 03, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:46:55.167Z: Workers have started successfully.
    Sep 03, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:46:55.188Z: Workers have started successfully.
    Sep 03, 2021 6:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:47:21.396Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 03, 2021 6:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:47:21.514Z: Cleaning up.
    Sep 03, 2021 6:47:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:47:21.582Z: Stopping worker pool...
    Sep 03, 2021 6:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:49:42.524Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 03, 2021 6:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T06:49:42.569Z: Worker pool stopped.
    Sep 03, 2021 6:49:51 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-02_23_45_31-6395771659507821382 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 15639960-c59e-4ada-9121-07c04f0b0da3 and timestamp: 2021-09-03T06:49:51.615000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     6.425

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2021 6:49:52 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 38.174 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 29s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/7bs5ms5dihfh4

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2378

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2378/display/redirect?page=changes>

Changes:

[Andrew Pilloud] s/org.apache.beam.vendor.calcite.v1_20_0/org.apache.beam.vendor.calcite.v1_26_0/g

[Andrew Pilloud] [BEAM-9379] Update to vendored Calcite to 1.26.0

[Andrew Pilloud] Fix flattened rows

[Andrew Pilloud] Fix DDL

[Andrew Pilloud] Handle BeamRelNode in RelSubset

[Andrew Pilloud] Fix BeamIOPushDown

[Andrew Pilloud] [BEAM-9190] Update BeamBigQuerySqlDialect

[Andrew Pilloud] Remap IN to Search

[Andrew Pilloud] [BEAM-9379] Use byte[] instead of ByteString for (VAR)BINARY in UDFs.

[Andrew Pilloud] [BEAM-9379] Update UDF NULL type mismatch test since there is stricter

[Andrew Pilloud] Fix ZetaSQL window function mapping

[Andrew Pilloud] Fix Bigtable tests that depend on SQL types

[Andrew Pilloud] Workaround CALCITE-4759 in JoinPushThroughJoinRule

[Andrew Pilloud] Disable nested bytes tests, sorry!

[Andrew Pilloud] SqlLine is rotting, Just CAST types for now

[Andrew Pilloud] Update CHANGES.md

[Andrew Pilloud] Up spotbug stack size

[Andrew Pilloud] Fix BeamMatchRel copy

[Andrew Pilloud] partitionKey everywhere

[Andrew Pilloud] Make it functional

[Andrew Pilloud] Update CreateFunction

[Andrew Pilloud] No tpcds dependency

[Andrew Pilloud] Fix default time types

[Ankur Goenka] fixing release date

[noreply] [BEAM-12823] TestStream Support in Samza Portable Runner (#15421)


------------------------------------------
[...truncated 364.70 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2021 12:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2021 12:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2021 12:46:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 03, 2021 12:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 03, 2021 12:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 03, 2021 12:46:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 03, 2021 12:46:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 03, 2021 12:46:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Sep 03, 2021 12:46:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 03, 2021 12:46:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 03, 2021 12:46:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <112460 bytes, hash 7e29938c75dd500f595719334d07a824da330071ee38dce689c08b1d088e6ed8> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-fimTjHXdUA9ZVxkzTQeoJNozAHHuONzmicCLHQiObtg.pb
    Sep 03, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 03, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4667068793271008819.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-FaVVKvpkSVt0TquI7OuZqoyo34Rbag65zR13YcFz-HY.jar
    Sep 03, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-O4FtkomfozNZ5s1ThCH4YVQsv5jjgpXjdfBRuAgh72g.jar
    Sep 03, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-QwBCBtprx84y_3PbjBTvra8IZ_z5DGx5ADyBhvZp408.jar
    Sep 03, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests-b_iPDtxotfaDaPPyMCbCTp8vYSu2A6yRI3xwtrNkFTk.jar
    Sep 03, 2021 12:46:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_26_0/0.1/5fae4e97a2d8739462bd1572e48d01228766b6ef/beam-vendor-calcite-1_26_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_26_0-0.1-pYZ7esxRWyhKmBqBdfrpnxvg8woyykTvGbaCvLtyRyA.jar
    Sep 03, 2021 12:46:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 244 files cached, 4 files newly uploaded in 1 seconds
    Sep 03, 2021 12:46:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 03, 2021 12:46:10 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
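
    The SEVERE message above is gRPC's orphaned-channel diagnostic: the RuntimeException is only
    an allocation-site marker, showing that the channel was opened during BigQueryIO's pipeline
    validation and never closed. What the diagnostic asks for, as a self-contained sketch (the
    target string is copied from the log; the timeout is an arbitrary assumption):

        import io.grpc.ManagedChannel;
        import io.grpc.ManagedChannelBuilder;
        import java.util.concurrent.TimeUnit;

        class ChannelShutdownSketch {
          static void useAndClose() throws InterruptedException {
            ManagedChannel channel =
                ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
            try {
              // ... issue RPCs on the channel ...
            } finally {
              // Orderly shutdown first, then force-close if it does not drain in time.
              channel.shutdown();
              if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {
                channel.shutdownNow();
                channel.awaitTermination(10, TimeUnit.SECONDS);
              }
            }
          }
        }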

    Sep 03, 2021 12:46:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 03, 2021 12:46:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 03, 2021 12:46:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 03, 2021 12:46:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-02_17_46_10-9110703423107782645?project=apache-beam-testing
    Sep 03, 2021 12:46:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-02_17_46_10-9110703423107782645
    Sep 03, 2021 12:46:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-02_17_46_10-9110703423107782645
    Sep 03, 2021 12:46:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-03T00:46:13.946Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 03, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:46:19.639Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 03, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:46:20.455Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 03, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:46:20.497Z: Expanding GroupByKey operations into optimizable parts.
    Sep 03, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:46:20.534Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 03, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:46:20.620Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 03, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:46:20.655Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 03, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:46:20.703Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Sep 03, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:46:21.085Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 03, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:46:21.157Z: Starting 5 workers in us-central1-a...
    Sep 03, 2021 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:46:26.342Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 03, 2021 12:47:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:47:06.735Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 03, 2021 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:47:32.227Z: Workers have started successfully.
    Sep 03, 2021 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:47:32.268Z: Workers have started successfully.
    Sep 03, 2021 12:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:48:00.519Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Sep 03, 2021 12:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:48:00.639Z: Cleaning up.
    Sep 03, 2021 12:48:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:48:00.721Z: Stopping worker pool...
    Sep 03, 2021 12:50:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:50:19.625Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 03, 2021 12:50:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-03T00:50:19.672Z: Worker pool stopped.
    Sep 03, 2021 12:50:26 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-02_17_46_10-9110703423107782645 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2de0ee76-a6d9-4cb0-96bb-160cf71e7701 and timestamp: 2021-09-03T00:50:26.230000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.841

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 03, 2021 12:50:26 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
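    This warning is benign for the build result: the metrics above were collected but not
    exported. Publishing would need the InfluxDB measurement/database settings this run omitted;
    in Beam's perf-test harness they are normally passed as pipeline options, roughly as follows
    (option names assumed from the Beam test utilities, values hypothetical):

        --influxHost=http://localhost:8086 --influxDatabase=beam_test_metrics --influxMeasurement=sql_bqio_read_java_batch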

Gradle Test Executor 5 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 35.033 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 4s
152 actionable tasks: 109 executed, 43 from cache

Publishing build scan...
https://gradle.com/s/zyrh4jzvbum5i

Stopped 4 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2377

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2377/display/redirect?page=changes>

Changes:

[Steve Niemitz] [BEAM-12767] Add another unit test for PipelineOptions deserialization


------------------------------------------
[...truncated 350.74 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2021 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2021 6:45:11 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2021 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2021 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2021 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2021 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2021 6:45:12 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 02, 2021 6:45:12 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
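    Note the BeamCalcRel kept on top of the push-down source in this plan (absent from the plan
    in the build above): the final projection runs as a separate ParDo(Calc), visible as step s3
    below. For reference, such a query is issued through SqlTransform; a minimal hedged sketch,
    assuming HACKER_NEWS has already been registered with a BigQuery table provider (which the
    test harness does elsewhere):

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.extensions.sql.SqlTransform;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        class PushDownQuerySketch {
          public static void main(String[] args) {
            Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
            // With the table registered, the Calcite planner produces the
            // BeamCalcRel / BeamPushDownIOSourceRel plan shown in the log above.
            PCollection<Row> result =
                p.apply(
                    SqlTransform.query(
                        "SELECT `by` AS author, `type`, title, score "
                            + "FROM HACKER_NEWS "
                            + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));
            p.run().waitUntilFinish();
          }
        }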
    Sep 02, 2021 6:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 02, 2021 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 02, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115841 bytes, hash 7067f75afeddc8805cd16f3a75349968e4223819644f6776a68c2bb6c0f5d718> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cGf3Wv7dyIBc0W86dTSZaOQiOBlkT2d2powrtsD11xg.pb
    Sep 02, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 02, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ytej5qC_OvUfqwQf0G2BZteGa2RtQ9zU6RVC-b8HFUs.jar
    Sep 02, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4747870347571417666.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-eYw9YTW-ipkH6dFR6fXA5q5lpRpHCxEfYh-nNWE34rU.jar
    Sep 02, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 02, 2021 6:45:19 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 02, 2021 6:45:19 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 02, 2021 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 02, 2021 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 02, 2021 6:45:20 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 02, 2021 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 02, 2021 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-02_11_45_20-14835984397004002849?project=apache-beam-testing
    Sep 02, 2021 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-02_11_45_20-14835984397004002849
    Sep 02, 2021 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-02_11_45_20-14835984397004002849
    Sep 02, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-02T18:45:23.585Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 02, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:32.068Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 02, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:32.809Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 02, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:32.850Z: Expanding GroupByKey operations into optimizable parts.
    Sep 02, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:32.889Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 02, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:32.980Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 02, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:33.003Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 02, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:33.039Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 02, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:33.065Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 02, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:33.431Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:33.505Z: Starting 5 workers in us-central1-c...
    Sep 02, 2021 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:45:50.464Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 02, 2021 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:46:09.600Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Sep 02, 2021 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:46:09.655Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Sep 02, 2021 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:46:19.981Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 02, 2021 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:46:44.142Z: Workers have started successfully.
    Sep 02, 2021 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:46:44.168Z: Workers have started successfully.
    Sep 02, 2021 6:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:47:15.976Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2021 6:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:47:16.110Z: Cleaning up.
    Sep 02, 2021 6:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:47:16.181Z: Stopping worker pool...
    Sep 02, 2021 6:49:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:49:40.228Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 02, 2021 6:49:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T18:49:40.275Z: Worker pool stopped.
    Sep 02, 2021 6:49:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-02_11_45_20-14835984397004002849 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5debfae6-ca91-4476-99e2-a81f6e9904fa and timestamp: 2021-09-02T18:49:45.802000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.294

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2021 6:49:46 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 43.319 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 28s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/r2gn2oftdlaiq

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2376

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2376/display/redirect?page=changes>

Changes:

[aromanenko.dev] Remove a template line


------------------------------------------
[...truncated 349.20 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@241640660]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2021 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2021 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2021 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2021 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2021 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2021 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2021 12:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 02, 2021 12:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 02, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 02, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 02, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115844 bytes, hash 69001a9fb23ed6cfd4d03cabbd74cfee034c53bf170d1db282eeeaf19be903b6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-aQAan7I-1s_U0DyrvXTP7gNMU78XDR2ygu7q8ZvpA7Y.pb
    Sep 02, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 02, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2631922772940142491.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-bjh84ewULZG8cxb6ClKBwmwu0GM6hsoJcvzdBT05Zio.jar
    Sep 02, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ytej5qC_OvUfqwQf0G2BZteGa2RtQ9zU6RVC-b8HFUs.jar
    Sep 02, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 02, 2021 12:45:27 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 02, 2021 12:45:27 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 02, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 02, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 02, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 02, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 02, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-02_05_45_28-3766918607904455920?project=apache-beam-testing
    Sep 02, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-02_05_45_28-3766918607904455920
    Sep 02, 2021 12:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-02_05_45_28-3766918607904455920
    Sep 02, 2021 12:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-02T12:45:31.777Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 02, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:45:39.782Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 02, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:45:40.516Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 02, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:45:40.557Z: Expanding GroupByKey operations into optimizable parts.
    Sep 02, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:45:40.595Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 02, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:45:40.656Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 02, 2021 12:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:45:40.687Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:45:40.722Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:45:40.748Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:45:41.086Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:45:41.153Z: Starting 5 workers in us-central1-c...
    Sep 02, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:46:01.670Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 02, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:46:26.465Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 02, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:46:51.562Z: Workers have started successfully.
    Sep 02, 2021 12:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:46:51.617Z: Workers have started successfully.
    Sep 02, 2021 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:47:29.108Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2021 12:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:47:29.250Z: Cleaning up.
    Sep 02, 2021 12:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:47:29.327Z: Stopping worker pool...
    Sep 02, 2021 12:49:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:49:45.356Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 02, 2021 12:49:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T12:49:45.401Z: Worker pool stopped.
    Sep 02, 2021 12:49:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-02_05_45_28-3766918607904455920 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8d4ae825-6f23-403e-b5de-55be49b7e4cd and timestamp: 2021-09-02T12:49:51.404000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     14.72

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2021 12:49:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 41.022 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 33s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/uvo6h4w77ia5i

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2375

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2375/display/redirect?page=changes>

Changes:

[kawaigin] Updated Linux golden screenshots for notebook integration tests.


------------------------------------------
[...truncated 351.50 KB...]
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 02, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
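
    The pushed-down plan above amounts to a Storage Read API scan that fetches only the used fields and evaluates the supported filter server-side. A minimal sketch of the equivalent hand-written read follows; the public Hacker News table spec is an assumption for illustration, not taken from the job:

        import java.util.Arrays;
        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
        import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;

        public class DirectReadPushDownSketch {
          public static void main(String[] args) {
            Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
            p.apply(
                BigQueryIO.readTableRows()
                    .from("bigquery-public-data:hacker_news.full")  // hypothetical table spec
                    .withMethod(TypedRead.Method.DIRECT_READ)
                    // Only the columns the query touches are read from storage...
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // ...and the supported predicate is evaluated by the Storage Read API.
                    .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
            p.run().waitUntilFinish();
          }
        }
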
    Sep 02, 2021 6:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 02, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 02, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash a424b83217b6e73ceb58b78a23d9892afcfa978e440c94256f77afe8e7946dfc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pCS4Mhe25zzrWLeKI9mJKvz6l45EDJQlb3ev6OeUbfw.pb
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ytej5qC_OvUfqwQf0G2BZteGa2RtQ9zU6RVC-b8HFUs.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5587970716536340.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-DQYNemH78y-8o9_6B7i9j_utMAkA2lEMLzRA84hXxmQ.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-api/3.2.7/81408fc988c229ea11354fee9902c47842343f04/docker-java-api-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-api-3.2.7-AwtXDacySf3gUHoixVjjNqW0wUI0YcGJjMO-N7kxBho.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport-zerodep/3.2.7/d74cb05fabc57e9ec441dc39d0bc58ad649fad3d/docker-java-transport-zerodep-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-zerodep-3.2.7-O2a9rUb-899yDZMu4E8j9ItFOkNoggqbLP2xaLmOwJY.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-bigtable-emulator/0.125.2/e2c4eccdc638e5883b658a222b99a318a817f3c6/google-cloud-bigtable-emulator-0.125.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigtable-emulator-0.125.2-FiUK-2Jw2KpBfAi4-J15Ft5rFwkLvGw0DsE7fz_A75M.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport/3.2.7/315903a129f530422747efc163dd255f0fa2555e/docker-java-transport-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-3.2.7-ynNXlSXhDSyiLCXkA4c9E9_HGeKB7-wbpEzmswFrI-A.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.15.1/ccd374ca7e5d8b06fb31ecffeb03abc42feada84/kafka-1.15.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.15.1-Z4qZ3jdWXDvdB7xD9VivkdVQiRZWNWUhPSIANVCSZDA.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.15.1/91e6dfab8f141f77c6a0dd147a94bd186993a22c/testcontainers-1.15.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.15.1-_fMrLpRXSowMnAISRc73bMk5UqrXNnsv6vkPXOZRXu0.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Sep 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Sep 02, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 235 files cached, 13 files newly uploaded in 2 seconds
    Sep 02, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 02, 2021 6:45:06 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
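
    The channel-leak warning above is advisory (the job still completes), and the gRPC guidance it quotes amounts to closing every ManagedChannel explicitly. A minimal standalone sketch of that shutdown sequence; the target is copied from the warning, while the surrounding class and RPC usage are hypothetical:

        import io.grpc.ManagedChannel;
        import io.grpc.ManagedChannelBuilder;
        import java.util.concurrent.TimeUnit;

        public final class ChannelShutdownSketch {
          public static void main(String[] args) throws InterruptedException {
            // Hypothetical standalone channel; the leaked one above targets this endpoint.
            ManagedChannel channel =
                ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
            try {
              // ... issue RPCs on the channel ...
            } finally {
              channel.shutdown();  // begin orderly shutdown
              if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
                channel.shutdownNow();  // force-cancel anything still in flight
                channel.awaitTermination(5, TimeUnit.SECONDS);
              }
            }
          }
        }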

    Sep 02, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 02, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 02, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 02, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 02, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-01_23_45_06-3561076597603865314?project=apache-beam-testing
    Sep 02, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-01_23_45_06-3561076597603865314
    Sep 02, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-01_23_45_06-3561076597603865314
    Sep 02, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-02T06:45:10.286Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 02, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:15.700Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 02, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:16.576Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 02, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:16.614Z: Expanding GroupByKey operations into optimizable parts.
    Sep 02, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:16.641Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 02, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:16.699Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 02, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:16.733Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 02, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:16.765Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 02, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:16.814Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 02, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:17.125Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:17.207Z: Starting 5 workers in us-central1-a...
    Sep 02, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:45:27.216Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 02, 2021 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:46:01.385Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Sep 02, 2021 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:46:01.414Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Sep 02, 2021 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:46:11.694Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 02, 2021 6:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:46:34.703Z: Workers have started successfully.
    Sep 02, 2021 6:46:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:46:34.731Z: Workers have started successfully.
    Sep 02, 2021 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:47:02.869Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2021 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:47:02.988Z: Cleaning up.
    Sep 02, 2021 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:47:03.064Z: Stopping worker pool...
    Sep 02, 2021 6:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:49:26.799Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 02, 2021 6:49:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T06:49:26.841Z: Worker pool stopped.
    Sep 02, 2021 6:49:34 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-01_23_45_06-3561076597603865314 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 60391825-b7ab-428a-85b8-7156f824409e and timestamp: 2021-09-02T06:49:34.492000000Z:
                     Metric:                    Value:
                   read_time                     6.989
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2021 6:49:34 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 45.668 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 15s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/bccyxmzv2f7vc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2374

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2374/display/redirect?page=changes>

Changes:

[Steve Niemitz] [BEAM-12767] Fix handling display data in pipeline serialization


------------------------------------------
[...truncated 348.13 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
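
    The readUsingDefaultMethod failure above is Beam's standard complaint when a PCollection<Row> reaches coder inference without a schema attached; attaching one with PCollection.setRowSchema is the fix the error text points at. A minimal sketch, with hypothetical field names rather than the IT's actual schema:

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.Create;
        import org.apache.beam.sdk.transforms.MapElements;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;
        import org.apache.beam.sdk.values.TypeDescriptors;

        public class RowSchemaSketch {
          public static void main(String[] args) {
            Pipeline p = Pipeline.create();
            Schema schema = Schema.builder()
                .addStringField("author")
                .addInt64Field("score")
                .build();
            PCollection<Row> rows =
                p.apply(Create.of("alice:3", "bob:7"))
                    .apply(MapElements.into(TypeDescriptors.rows())
                        .via(s -> Row.withSchema(schema)
                            .addValues(s.split(":")[0], Long.parseLong(s.split(":")[1]))
                            .build()))
                    // Without this, Row coder inference fails exactly as in the trace above.
                    .setRowSchema(schema);
            p.run().waitUntilFinish();
          }
        }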

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 02, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 02, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 02, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 02, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 02, 2021 12:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 02, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 02, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 02, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115845 bytes, hash 76567c9b3c08f7375ecaa2b32bb0e58caaf044dafda60051f3f0a0f499e5205e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-dlZ8mzwI9zdeyqKzK7DljKrwRNr9pgBR8_Cg9JnlIF4.pb
    Sep 02, 2021 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 02, 2021 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ytej5qC_OvUfqwQf0G2BZteGa2RtQ9zU6RVC-b8HFUs.jar
    Sep 02, 2021 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9077390707349316124.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-NVLbgXe5SDIPfExZCVwKo5LibYqFGsCrGxScVrVnQh8.jar
    Sep 02, 2021 12:45:13 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 02, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 02, 2021 12:45:13 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 02, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 02, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 02, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 02, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 02, 2021 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-01_17_45_13-4821841411323054159?project=apache-beam-testing
    Sep 02, 2021 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-01_17_45_13-4821841411323054159
    Sep 02, 2021 12:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-01_17_45_13-4821841411323054159
    Sep 02, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-02T00:45:17.259Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 02, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:45:34.299Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 02, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:45:35.146Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 02, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:45:35.213Z: Expanding GroupByKey operations into optimizable parts.
    Sep 02, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:45:35.253Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 02, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:45:35.364Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 02, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:45:35.402Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 02, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:45:35.444Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 02, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:45:35.475Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 02, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:45:35.926Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:45:36.005Z: Starting 5 workers in us-central1-c...
    Sep 02, 2021 12:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:46:01.640Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 02, 2021 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:46:11.287Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 02, 2021 12:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:46:11.318Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 02, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:46:21.639Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 02, 2021 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:46:46.908Z: Workers have started successfully.
    Sep 02, 2021 12:46:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:46:46.924Z: Workers have started successfully.
    Sep 02, 2021 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:47:19.784Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 02, 2021 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:47:19.966Z: Cleaning up.
    Sep 02, 2021 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:47:20.076Z: Stopping worker pool...
    Sep 02, 2021 12:49:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:49:34.585Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 02, 2021 12:49:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-02T00:49:34.634Z: Worker pool stopped.
    Sep 02, 2021 12:49:40 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-01_17_45_13-4821841411323054159 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b7e25b00-a2c7-48ce-aaeb-0c659d042c75 and timestamp: 2021-09-02T00:49:40.606000000Z:
                     Metric:                    Value:
                   read_time                     7.633
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 02, 2021 12:49:40 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
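
The warning above means the run's metrics were computed but never exported: the InfluxDB publisher skips publishing when no target measurement/database is configured. In Beam's perf-test jobs these are normally supplied as pipeline options; the option names below follow Beam's test-infra conventions and the values are purely illustrative:

    --influxDatabase=beam_test_metrics --influxMeasurement=sql_bqio_read_java_batch --influxHost=http://localhost:8086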

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 47.093 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 19s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/sasc4sdejlv6i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2373

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2373/display/redirect?page=changes>

Changes:

[baeminbo] Fix apiclient_test unittest not to fail with no credentials

[noreply] [BEAM-10913] - Adding new Grafana dashboard to monitor GAs post-commit


------------------------------------------
[...truncated 347.34 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2021 6:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 01, 2021 6:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
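
The two INFO lines above are the observable effect of push-down: the planner rewrites the scan so that only the four referenced fields are requested and the supported predicate is evaluated by the BigQuery Storage API rather than in the pipeline. A hand-written read with the same shape would look roughly like the sketch below; it is a minimal illustration against BigQueryIO's public API, with a hypothetical table name and the parse/coder plumbing elided:

    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.io.gcp.bigquery.SchemaAndRecord;

    // Same projection and filter the planner pushed down, expressed directly:
    TypedRead<SchemaAndRecord> read =
        BigQueryIO.read((SchemaAndRecord sar) -> sar)          // identity parse fn (placeholder)
            .from("bigquery-public-data:hacker_news.full")     // hypothetical table name
            .withMethod(TypedRead.Method.DIRECT_READ)          // BigQuery Storage API read
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
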
    Sep 01, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 01, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 01, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115844 bytes, hash 742b9993c5d0e3f00f646447e0ede3268e435e619b34dfc1172b7322c3824cac> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-dCuZk8XQ4_APZGRH4O3jJo5DXmGbNN_BFytzIsOCTKw.pb
    Sep 01, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 01, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ytej5qC_OvUfqwQf0G2BZteGa2RtQ9zU6RVC-b8HFUs.jar
    Sep 01, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test787047889011239436.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-X-MncI8cINWxxqL1ht0Dax0a18hGRtZMP8BCw4nQmis.jar
    Sep 01, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 01, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 01, 2021 6:45:11 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
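
The SEVERE block above is gRPC's orphaned-channel detector: the ManagedChannel opened for the BigQuery write client during pipeline validation was garbage-collected without ever being shut down. The clean-up the message asks for is the standard gRPC pattern, sketched here as a generic illustration (not the Beam-internal fix):

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;

    static void closeQuietly(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();                                // begin orderly shutdown
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
        channel.shutdownNow();                           // cancel outstanding calls
        channel.awaitTermination(5, TimeUnit.SECONDS);   // wait for the forced shutdown
      }
    }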

    Sep 01, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 01, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 01, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 01, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 01, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-01_11_45_12-6850683916748846305?project=apache-beam-testing
    Sep 01, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-01_11_45_12-6850683916748846305
    Sep 01, 2021 6:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-01_11_45_12-6850683916748846305
    Sep 01, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-01T18:45:15.715Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 01, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:22.406Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 01, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:23.039Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 01, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:23.066Z: Expanding GroupByKey operations into optimizable parts.
    Sep 01, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:23.093Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 01, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:23.166Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 01, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:23.195Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 01, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:23.222Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 01, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:23.251Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 01, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:23.573Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:23.641Z: Starting 5 workers in us-central1-c...
    Sep 01, 2021 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:34.937Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 01, 2021 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:54.238Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 01, 2021 6:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:46:54.265Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 01, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:47:04.533Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 01, 2021 6:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:47:28.938Z: Workers have started successfully.
    Sep 01, 2021 6:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:47:28.966Z: Workers have started successfully.
    Sep 01, 2021 6:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:47:58.287Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2021 6:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:47:58.429Z: Cleaning up.
    Sep 01, 2021 6:47:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:47:58.498Z: Stopping worker pool...
    Sep 01, 2021 6:50:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:50:17.813Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 01, 2021 6:50:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T18:50:17.856Z: Worker pool stopped.
    Sep 01, 2021 6:55:38 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-01_11_45_12-6850683916748846305 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 32a89e32-15ca-43ba-a200-1d45cd051fb6 and timestamp: 2021-09-01T18:55:38.389000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.065

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2021 6:55:38 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 10 mins 43.624 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 15s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/kdeuptchy5ies

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2372

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2372/display/redirect>

Changes:


------------------------------------------
[...truncated 346.00 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@241640660]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
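
This is the same failure in every run above: readUsingDefaultMethod produces a Row-typed PCollection that never gets a schema attached, so Beam cannot infer a coder for it. The exception text names the remedy itself; a minimal sketch of attaching a schema follows, with field names taken from the query in these logs and the Int64 type for score assumed rather than confirmed:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    Schema schema =
        Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")     // assumed type; check the source table
            .build();

    PCollection<Row> rows = /* the Row output whose coder could not be inferred */;
    rows.setRowSchema(schema);          // Beam can now use RowCoder for this collection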

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 01, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 01, 2021 12:44:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 01, 2021 12:44:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 01, 2021 12:44:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115842 bytes, hash d9d35a174b28f16cddb61e2dc2db9d5f7d9fa9ec766455f1cbde89103ac82ad1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2dNaF0so8Wzdth4twtudX32fqex2ZFXxy96JEDrIKtE.pb
    Sep 01, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 01, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ytej5qC_OvUfqwQf0G2BZteGa2RtQ9zU6RVC-b8HFUs.jar
    Sep 01, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1716425767701768398.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-c913HUNVArZdB6xd4nO0-ZZPqeFqgk9aBqxLWDyCpNU.jar
    Sep 01, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 01, 2021 12:44:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 01, 2021 12:44:52 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 01, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 01, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 01, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 01, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 01, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-01_05_44_53-11170188808383602487?project=apache-beam-testing
    Sep 01, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-01_05_44_53-11170188808383602487
    Sep 01, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-01_05_44_53-11170188808383602487
    Sep 01, 2021 12:44:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-01T12:44:56.846Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 01, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:04.640Z: Worker configuration: e2-standard-2 in us-central1-b.
    Sep 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:05.424Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:05.471Z: Expanding GroupByKey operations into optimizable parts.
    Sep 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:05.494Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:05.569Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:05.597Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:05.630Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:05.681Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:06.064Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:06.134Z: Starting 5 workers in us-central1-b...
    Sep 01, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:34.260Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 01, 2021 12:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:45:58.037Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 01, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:46:24.293Z: Workers have started successfully.
    Sep 01, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:46:24.325Z: Workers have started successfully.
    Sep 01, 2021 12:46:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:46:57.918Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2021 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:46:58.044Z: Cleaning up.
    Sep 01, 2021 12:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:46:58.116Z: Stopping worker pool...
    Sep 01, 2021 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:49:19.452Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 01, 2021 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T12:49:19.487Z: Worker pool stopped.
    Sep 01, 2021 12:49:25 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-01_05_44_53-11170188808383602487 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 157b38e9-4c5b-4ec0-9a8d-d9032ed44b17 and timestamp: 2021-09-01T12:49:25.423000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.011

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2021 12:49:25 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 31 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.001 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker Thread 10,5,main]) completed. Took 4 mins 49.953 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 6s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/ktwa5i3oml6qg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2371

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2371/display/redirect?page=changes>

Changes:

[noreply] Add per-batch metrics to JdbcIO.write (#15429)


------------------------------------------
[...truncated 350.09 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 01, 2021 6:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 01, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 01, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 01, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115846 bytes, hash 5a58088714df9d74d12b849b6bf8d3f7929851ee8684ea5bb4c343873771f563> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-WlgIhxTfnXTRK4Sba_jT95KYUe6GhOpbtMNDhzdx9WM.pb
    Sep 01, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 01, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ytej5qC_OvUfqwQf0G2BZteGa2RtQ9zU6RVC-b8HFUs.jar
    Sep 01, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1708190551259261885.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-__2AAf5i5MHu44rrO1kIz8EUfz9n6G1kz7rmUUjYSTE.jar
    Sep 01, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Sep 01, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 01, 2021 6:45:08 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
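
For reference, the shutdown sequence this SEVERE message asks for looks like the
minimal sketch below. It is illustrative only: the leaked channel in this log is
created internally by the BigQuery Storage write client during pipeline
validation, so the test code never owns it. The target string is copied from the
log; everything else is a generic, assumed usage pattern, not this build's code.

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        // Hypothetical caller-owned channel (the one in this log is not).
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs over the channel ...
        } finally {
          // What the warning asks for: initiate shutdown and wait for
          // termination, escalating to shutdownNow() if it does not finish.
          channel.shutdown();
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }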

    Sep 01, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 01, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 01, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 01, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 01, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-31_23_45_09-10104240802867659758?project=apache-beam-testing
    Sep 01, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-31_23_45_09-10104240802867659758
    Sep 01, 2021 6:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-31_23_45_09-10104240802867659758
    Sep 01, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-01T06:45:12.957Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 01, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:19.982Z: Worker configuration: e2-standard-2 in us-central1-c.
    Sep 01, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:20.741Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 01, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:20.783Z: Expanding GroupByKey operations into optimizable parts.
    Sep 01, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:20.816Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 01, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:20.872Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 01, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:20.909Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 01, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:20.933Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 01, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:20.984Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 01, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:21.359Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:21.430Z: Starting 5 workers in us-central1-c...
    Sep 01, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:39.333Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 01, 2021 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:55.035Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Sep 01, 2021 6:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:45:55.055Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Sep 01, 2021 6:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:46:05.338Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 01, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:46:34.096Z: Workers have started successfully.
    Sep 01, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:46:34.134Z: Workers have started successfully.
    Sep 01, 2021 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:47:16.282Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2021 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:47:16.441Z: Cleaning up.
    Sep 01, 2021 6:47:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:47:16.529Z: Stopping worker pool...
    Sep 01, 2021 6:49:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:49:33.253Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 01, 2021 6:49:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T06:49:33.296Z: Worker pool stopped.
    Sep 01, 2021 6:49:38 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-31_23_45_09-10104240802867659758 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c1e997a6-e932-4fba-91b2-38ef40206f97 and timestamp: 2021-09-01T06:49:38.837000000Z:
                     Metric:                    Value:
                   read_time                     9.501
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2021 6:49:39 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 28 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.004 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 45.966 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 18s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/kdyjdzou4d2y2

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2370

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2370/display/redirect?page=changes>

Changes:

[emilyye] sync nltk, orjson for Python image

[noreply] Fix typo in BigQuery documentation

[Steve Niemitz] [BEAM-12767] Improve PipelineOption parsing UX


------------------------------------------
[...truncated 363.38 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@107388924]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
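
All three root causes listed above point at the same fix: a PCollection of Beam
Rows cannot get a Coder by inference, so the pipeline has to attach a schema
(via PCollection.setRowSchema or Create.Values.withRowSchema) or an explicit
Coder via setCoder. A minimal sketch, using made-up field names rather than the
HACKER_NEWS schema this test actually reads:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; field names are assumptions, not the test's.
        Schema schema =
            Schema.builder().addStringField("author").addInt32Field("score").build();
        Row row = Row.withSchema(schema).addValues("someone", 3).build();

        // Without withRowSchema (or setRowSchema/setCoder on the resulting
        // PCollection), coder inference for Row fails with the
        // IllegalStateException shown above.
        PCollection<Row> rows = p.apply(Create.of(row).withRowSchema(schema));

        p.run().waitUntilFinish();
      }
    }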

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2021 12:50:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2021 12:50:49 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2021 12:50:50 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 01, 2021 12:50:50 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 01, 2021 12:50:50 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 01, 2021 12:50:50 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 01, 2021 12:50:51 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 01, 2021 12:50:51 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Sep 01, 2021 12:50:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 01, 2021 12:51:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 01, 2021 12:51:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 04dffd189a663aeb1d21c33673302161ced769c54075735f79258089cb2e7c80> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BN_9GJpmOusdIcM2czAhYc7XacVAdXNfeSWAicsufIA.pb
    Sep 01, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 01, 2021 12:51:14 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ytej5qC_OvUfqwQf0G2BZteGa2RtQ9zU6RVC-b8HFUs.jar
    Sep 01, 2021 12:51:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1926280571839756595.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Vy13RwHp0rFhsCHdkvwwnAERLXW6kSBt9-InmEepIgU.jar
    Sep 01, 2021 12:51:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 6 seconds
    Sep 01, 2021 12:51:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 01, 2021 12:51:20 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Sep 01, 2021 12:51:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Sep 01, 2021 12:51:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Sep 01, 2021 12:51:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Sep 01, 2021 12:51:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 01, 2021 12:51:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-31_17_51_22-725356961012626473?project=apache-beam-testing
    Sep 01, 2021 12:51:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-31_17_51_22-725356961012626473
    Sep 01, 2021 12:51:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-31_17_51_22-725356961012626473
    Sep 01, 2021 12:51:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-01T00:51:26.063Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 01, 2021 12:51:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:31.597Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 01, 2021 12:51:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:32.277Z: Expanding CoGroupByKey operations into optimizable parts.
    Sep 01, 2021 12:51:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:32.327Z: Expanding GroupByKey operations into optimizable parts.
    Sep 01, 2021 12:51:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:32.364Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Sep 01, 2021 12:51:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:32.436Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Sep 01, 2021 12:51:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:32.467Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Sep 01, 2021 12:51:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:32.497Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Sep 01, 2021 12:51:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:32.521Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Sep 01, 2021 12:51:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:32.895Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2021 12:51:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:32.989Z: Starting 5 workers in us-central1-a...
    Sep 01, 2021 12:51:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:51:50.636Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 01, 2021 12:52:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:52:16.437Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Sep 01, 2021 12:52:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:52:41.493Z: Workers have started successfully.
    Sep 01, 2021 12:52:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:52:41.525Z: Workers have started successfully.
    Sep 01, 2021 12:53:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:53:10.869Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Sep 01, 2021 12:53:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:53:11.058Z: Cleaning up.
    Sep 01, 2021 12:53:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:53:11.146Z: Stopping worker pool...
    Sep 01, 2021 12:55:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:55:34.870Z: Autoscaling: Resized worker pool from 5 to 0.
    Sep 01, 2021 12:55:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-01T00:55:34.914Z: Worker pool stopped.
    Sep 01, 2021 12:55:40 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-31_17_51_22-725356961012626473 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3d4ac1e4-ad37-4db1-80dc-aeb0e9b52b28 and timestamp: 2021-09-01T00:55:40.309000000Z:
                     Metric:                    Value:
                   read_time                     9.725
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 01, 2021 12:55:40 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 5 mins 45.292 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 28s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/gd5smdiymyyk6

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2369

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2369/display/redirect?page=changes>

Changes:

[noreply] Allow `google-auth < 3`

[samuelw] [BEAM-12776] Change closing to happen in background in parallel for

[Luke Cwik] [BEAM-12802] Refactor DataStreamsDecoder so that it becomes aware of the

[ajamato] [BEAM-11994] Instantiate a new ServiceCallMetric before each request to

[Ankur Goenka] Remove duplicate 2.33.0 section

[noreply] add python spark example in documentation (#15426)


------------------------------------------
[...truncated 349.86 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2021 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2021 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2021 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2021 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2021 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2021 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2021 6:45:29 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 31, 2021 6:45:29 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 31, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 31, 2021 6:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 31, 2021 6:45:34 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash 53fe55eb534c97c3be367ddef275c2a5557bd24d155e1114d58431de9bc1245b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-U_5V61NMl8O-Nn3e8nXCpVV70k0VXhEU1YQx3pvBJFs.pb
    Aug 31, 2021 6:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 31, 2021 6:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ytej5qC_OvUfqwQf0G2BZteGa2RtQ9zU6RVC-b8HFUs.jar
    Aug 31, 2021 6:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4538140494941364567.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ESG5dPUQxiWfwe04CRAsabG8j2mqAVNJbjB0nkmAV5U.jar
    Aug 31, 2021 6:45:37 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 31, 2021 6:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 31, 2021 6:45:38 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1263)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 31, 2021 6:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 31, 2021 6:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 31, 2021 6:45:38 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 31, 2021 6:45:38 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 31, 2021 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-31_11_45_38-6222153004379213378?project=apache-beam-testing
    Aug 31, 2021 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-31_11_45_38-6222153004379213378
    Aug 31, 2021 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-31_11_45_38-6222153004379213378
    Aug 31, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-31T18:45:42.005Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 31, 2021 6:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:45:55.522Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 31, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:45:56.286Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 31, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:45:56.331Z: Expanding GroupByKey operations into optimizable parts.
    Aug 31, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:45:56.366Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 31, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:45:56.437Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 31, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:45:56.474Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 31, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:45:56.499Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 31, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:45:56.533Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 31, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:45:57.215Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2021 6:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:45:57.376Z: Starting 5 workers in us-central1-c...
    Aug 31, 2021 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:46:28.234Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 31, 2021 6:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:46:38.036Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 31, 2021 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:47:04.860Z: Workers have started successfully.
    Aug 31, 2021 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:47:04.888Z: Workers have started successfully.
    Aug 31, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:47:41.616Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2021 6:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:47:41.858Z: Cleaning up.
    Aug 31, 2021 6:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:47:41.945Z: Stopping worker pool...
    Aug 31, 2021 6:49:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:49:58.555Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 31, 2021 6:49:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T18:49:58.596Z: Worker pool stopped.
    Aug 31, 2021 6:50:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-31_11_45_38-6222153004379213378 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): aa63ec86-f6eb-4d83-b1a4-7a984ed0036c and timestamp: 2021-08-31T18:50:10.699000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.937

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2021 6:50:11 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 52.454 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 52s
152 actionable tasks: 99 executed, 53 from cache

Publishing build scan...
https://gradle.com/s/zcklscxinfbuo

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2368

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2368/display/redirect>

Changes:


------------------------------------------
[...truncated 347.47 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
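
The IllegalStateException above names its own remedy: a ParDo that emits Beam Row values cannot have a Coder inferred, so the output PCollection needs a schema attached via PCollection.setRowSchema (or an explicit Coder via setCoder). A minimal sketch of that pattern, assuming a hypothetical RowMonitorFn DoFn and a hand-written schema for the four projected columns (the real test derives both from its SQL plan):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical schema covering the columns the query projects.
    Schema schema = Schema.builder()
        .addStringField("author")
        .addStringField("type")
        .addStringField("title")
        .addInt64Field("score")
        .build();

    PCollection<Row> rows = input
        .apply("RowMonitor", ParDo.of(new RowMonitorFn()))
        // Without this call, finishSpecifying() fails with the
        // "Unable to return a default Coder" error above, because a
        // RowCoder can only be constructed from an explicit schema.
        .setRowSchema(schema);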

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 31, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
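
The BEAMPlan above shows the optimizer replacing the LogicalProject/LogicalFilter pair with a BeamPushDownIOSourceRel, so only the four used fields are read and the supported predicate is evaluated by BigQuery itself rather than in the pipeline. At the IO level this corresponds roughly to a DIRECT_READ with selected fields and a row restriction; a hedged sketch (the table name is an assumption, and the test actually resolves its table through the Beam SQL table provider):

    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    // Approximately what the pushed-down source reads: projection and
    // filter are handed to the BigQuery Storage API instead of being
    // applied in downstream ParDos.
    pipeline.apply(
        BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full")  // assumed table
            .withMethod(Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction(
                "(type = 'story' OR type = 'job') AND score > 2"));
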
    Aug 31, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 31, 2021 12:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 31, 2021 12:45:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash 2a186197c690522e2d250d9c7ea353636754e70dbf3728011ab9ed49ad661a25> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Khhhl8aQUi4tJQ2cfqNTY2dU5w2_NygBGrntSa1mGiU.pb
    Aug 31, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 31, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6058284577247469217.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-gnei1AeVb2nqzuNkfbLFqzv5btPT3qbh_3FV8uiieBI.jar
    Aug 31, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 31, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 31, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 31, 2021 12:45:03 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
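
The SEVERE block above is gRPC's orphaned-channel detector: the RuntimeException is never actually thrown; it is pre-captured at channel creation and logged when a ManagedChannel gets garbage-collected while still open, purely to show the allocation site. Per the stack trace, the channel belongs to the BigQueryWriteClient that BigQueryServicesImpl creates during pipeline validation, so a fix would presumably close that client rather than change the test. The cleanup sequence the message asks for is the standard gRPC idiom, sketched here with an assumed five-second grace period:

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;

    static void closeChannel(ManagedChannel channel) throws InterruptedException {
        // Initiate an orderly shutdown and let in-flight RPCs drain.
        channel.shutdown();
        if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            // Force-close once the grace period elapses; the warning asks
            // callers to wait until awaitTermination() returns true.
            channel.shutdownNow();
            channel.awaitTermination(5, TimeUnit.SECONDS);
        }
    }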

    Aug 31, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 31, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 31, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 31, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 31, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-31_05_45_04-6758584460765859458?project=apache-beam-testing
    Aug 31, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-31_05_45_04-6758584460765859458
    Aug 31, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-31_05_45_04-6758584460765859458
    Aug 31, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-31T12:45:08.069Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 31, 2021 12:45:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:11.946Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 31, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:12.650Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 31, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:12.692Z: Expanding GroupByKey operations into optimizable parts.
    Aug 31, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:12.750Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 31, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:12.817Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 31, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:12.843Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 31, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:12.878Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 31, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:12.905Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 31, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:13.258Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:13.337Z: Starting 5 workers in us-central1-a...
    Aug 31, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:45:26.219Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 31, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:46:01.999Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 31, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:46:27.731Z: Workers have started successfully.
    Aug 31, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:46:27.782Z: Workers have started successfully.
    Aug 31, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:46:55.930Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:46:56.047Z: Cleaning up.
    Aug 31, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:46:56.145Z: Stopping worker pool...
    Aug 31, 2021 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:49:18.497Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 31, 2021 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T12:49:18.538Z: Worker pool stopped.
    Aug 31, 2021 12:49:23 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-31_05_45_04-6758584460765859458 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2909e591-504c-4cf7-be33-1b300608311c and timestamp: 2021-08-31T12:49:23.771000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.523

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2021 12:49:24 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 35.948 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 4s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/q4vr2o7t3afis

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2367

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2367/display/redirect>

Changes:


------------------------------------------
[...truncated 347.48 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 31, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 31, 2021 6:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 31, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 31, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 89fc807e3b92450df9e203199aa9b3942a221a8a93c20af9c07558f891995fd3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ifyAfjuSRQ354gMZmqmzlCoiGoqTwgr5wHVY-JGZX9M.pb
    Aug 31, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 31, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 31, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4807974909720718542.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Qbl31IJ-oIxIbKsDKcSxtV8Wr41G1Mp9GGQcbuozjCs.jar
    Aug 31, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 31, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 31, 2021 6:45:07 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 31, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 31, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 31, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 31, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 31, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-30_23_45_07-14412756047924426894?project=apache-beam-testing
    Aug 31, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-30_23_45_07-14412756047924426894
    Aug 31, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-30_23_45_07-14412756047924426894
    Aug 31, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-31T06:45:10.800Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 31, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:17.282Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 31, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:18.222Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 31, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:18.299Z: Expanding GroupByKey operations into optimizable parts.
    Aug 31, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:18.329Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 31, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:18.391Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 31, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:18.416Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 31, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:18.450Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 31, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:18.477Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 31, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:18.773Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:18.847Z: Starting 5 workers in us-central1-c...
    Aug 31, 2021 6:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:45:44.377Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 31, 2021 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:46:00.321Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 31, 2021 6:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:46:25.563Z: Workers have started successfully.
    Aug 31, 2021 6:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:46:25.590Z: Workers have started successfully.
    Aug 31, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:46:55.235Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:46:55.386Z: Cleaning up.
    Aug 31, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:46:55.461Z: Stopping worker pool...
    Aug 31, 2021 6:49:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:49:10.146Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 31, 2021 6:49:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T06:49:10.194Z: Worker pool stopped.
    Aug 31, 2021 6:49:17 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-30_23_45_07-14412756047924426894 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0698a1ce-6f39-449a-acc7-8e3827caa078 and timestamp: 2021-08-31T06:49:17.793000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.824

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2021 6:49:18 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 28.938 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 59s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/5rupyd2krk5no

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2366

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2366/display/redirect?page=changes>

Changes:

[Andrew Pilloud] Docs are built in build_release_candidate.sh

[Andrew Pilloud] Add pypy to email

[Andrew Pilloud] Update CHANGES.md along with website

[Ankur Goenka] Fix Change log for Dataframe preview

[kawaigin] [BEAM-10708] Added beam_sql magics

[noreply] Merge pull request #15007 from [BEAM-12428] Implement


------------------------------------------
[...truncated 361.89 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@84917236]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2021 12:49:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2021 12:49:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2021 12:49:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 31, 2021 12:49:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 31, 2021 12:49:53 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 31, 2021 12:49:53 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 31, 2021 12:49:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 31, 2021 12:49:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
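
For context on what the plan above achieves: BeamPushDownIOSourceRel reads
only the used fields and hands the supported predicate to the BigQuery
Storage Read API, so both projection and filtering happen server side. A
hedged sketch of the equivalent hand-written read via BigQueryIO follows;
the public hacker_news table name is illustrative, not the table this job
actually reads:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class PushDownReadSketch {
      public static void main(String[] args) {
        Pipeline pipeline =
            Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        pipeline.apply("DirectReadWithPushDown",
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // illustrative
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Projection push-down: only the used fields are read.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: evaluated by the Storage Read API.
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));

        pipeline.run().waitUntilFinish();
      }
    }
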
    Aug 31, 2021 12:49:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 31, 2021 12:50:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 31, 2021 12:50:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115845 bytes, hash a5df514dacbb7a5ac788de28781016eb80c62d09f198c11f0d0f1fae027e0ff3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pd9RTay7elrHiN4oeBAW64DGLQnxmMEfDQ8frgJ-D_M.pb
    Aug 31, 2021 12:50:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 31, 2021 12:50:11 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 31, 2021 12:50:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test115956588323711959.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_-L5644iT54dDNBcYNirh-JRj8dptdDY0DnbEqo8Oyc.jar
    Aug 31, 2021 12:50:16 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 4 seconds
    Aug 31, 2021 12:50:16 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 31, 2021 12:50:16 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
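
The SEVERE message above comes from gRPC's orphaned-channel detector: the
stack shows a BigQuery write client created during pipeline validation that
was garbage collected without being closed. It is noisy but does not fail
the build. A minimal sketch of the shutdown discipline the warning asks for,
for any code that owns a ManagedChannel (the target and timeout here are
illustrative):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs on the channel here ...
        } finally {
          channel.shutdown(); // start an orderly shutdown
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-cancel anything still in flight
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }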

    Aug 31, 2021 12:50:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 31, 2021 12:50:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 31, 2021 12:50:17 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 31, 2021 12:50:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 31, 2021 12:50:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-30_17_50_17-6575149304075824110?project=apache-beam-testing
    Aug 31, 2021 12:50:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-30_17_50_17-6575149304075824110
    Aug 31, 2021 12:50:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-30_17_50_17-6575149304075824110
    Aug 31, 2021 12:50:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-31T00:50:21.533Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
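
On the WARNING just above: with autoscalingAlgorithm=NONE, Dataflow runs a
fixed worker pool sized by numWorkers, and maxNumWorkers is simply ignored.
A sketch of the two configurations side by side, using the standard
DataflowPipelineOptions setters (the class name WorkerPoolOptionsSketch is
hypothetical):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);

        // Fixed pool, as in this perf test: numWorkers is authoritative and
        // maxNumWorkers is ignored, which is exactly what the warning says.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);

        // Autoscaled alternative: let Dataflow scale, bounded by a maximum.
        // options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.THROUGHPUT_BASED);
        // options.setMaxNumWorkers(5);
      }
    }
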
    Aug 31, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:29.173Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 31, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:29.871Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 31, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:29.914Z: Expanding GroupByKey operations into optimizable parts.
    Aug 31, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:29.940Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 31, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:30.042Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 31, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:30.069Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 31, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:30.103Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 31, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:30.135Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 31, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:30.496Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:30.588Z: Starting 5 workers in us-central1-b...
    Aug 31, 2021 12:50:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:50:44.456Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 31, 2021 12:51:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:51:15.709Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 31, 2021 12:51:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:51:40.872Z: Workers have started successfully.
    Aug 31, 2021 12:51:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:51:40.902Z: Workers have started successfully.
    Aug 31, 2021 12:52:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:52:08.822Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 31, 2021 12:52:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:52:09.150Z: Cleaning up.
    Aug 31, 2021 12:52:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:52:09.262Z: Stopping worker pool...
    Aug 31, 2021 12:54:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:54:35.815Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 31, 2021 12:54:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-31T00:54:35.856Z: Worker pool stopped.
    Aug 31, 2021 12:54:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-30_17_50_17-6575149304075824110 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6e71396c-b534-484f-be4f-80bd3895490f and timestamp: 2021-08-31T00:54:42.640000000Z:
                     Metric:                    Value:
                   read_time                      6.85
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 31, 2021 12:54:43 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.089 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.109 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 5 mins 34.923 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 54s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/bxswdzullocm4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2365

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2365/display/redirect?page=changes>

Changes:

[Kyle Weaver] Revert "Merge pull request #15271 Decreasing peak memory usage for

[noreply] [BEAM-11097] Refactor Side Input opening to abstract away from ParDo

[Kyle Weaver] [BEAM-12820] Fix null check error


------------------------------------------
[...truncated 348.32 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1883789984]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2021 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2021 6:45:15 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2021 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2021 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2021 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2021 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2021 6:45:16 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 30, 2021 6:45:16 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 30, 2021 6:45:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 30, 2021 6:45:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 30, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash a5a7e51e6fb7c25413a51acf8e91902672c1fe2d45255638fadb8107fcac0fe6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-paflHm-3wlQTpRrPjpGQJnLB_i1FJVY4-tuBB_ysD-Y.pb
    Aug 30, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 30, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 30, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3492696764925976511.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-xOdFwNNuGHtUnhis29nphbA0pBkJM3dGtHupPxb1fGc.jar
    Aug 30, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 2 seconds
    Aug 30, 2021 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 30, 2021 6:45:25 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 30, 2021 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 30, 2021 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 30, 2021 6:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 30, 2021 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 30, 2021 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-30_11_45_26-16876456216403520199?project=apache-beam-testing
    Aug 30, 2021 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-30_11_45_26-16876456216403520199
    Aug 30, 2021 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-30_11_45_26-16876456216403520199
    Aug 30, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-30T18:45:29.518Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 30, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:45:38.478Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 30, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:45:39.244Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 30, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:45:39.285Z: Expanding GroupByKey operations into optimizable parts.
    Aug 30, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:45:39.311Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 30, 2021 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:45:39.445Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 30, 2021 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:45:39.496Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 30, 2021 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:45:39.531Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 30, 2021 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:45:39.568Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 30, 2021 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:45:39.981Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2021 6:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:45:40.055Z: Starting 5 workers in us-central1-c...
    Aug 30, 2021 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:46:04.832Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 30, 2021 6:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:46:20.153Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 30, 2021 6:46:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:46:45.685Z: Workers have started successfully.
    Aug 30, 2021 6:46:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:46:45.718Z: Workers have started successfully.
    Aug 30, 2021 6:47:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:47:17.878Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2021 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:47:18.020Z: Cleaning up.
    Aug 30, 2021 6:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:47:18.084Z: Stopping worker pool...
    Aug 30, 2021 6:49:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:49:38.386Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 30, 2021 6:49:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T18:49:38.413Z: Worker pool stopped.
    Aug 30, 2021 6:49:44 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-30_11_45_26-16876456216403520199 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): df61a884-8887-4541-97b0-cac75b3bbfce and timestamp: 2021-08-30T18:49:44.562000000Z:
                     Metric:                    Value:
                   read_time                    10.197
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2021 6:49:45 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.039 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 41.874 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 22s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/rgqx66hxrtqs6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2364

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2364/display/redirect>

Changes:


------------------------------------------
[...truncated 347.50 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 30, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 30, 2021 12:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 30, 2021 12:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 30, 2021 12:44:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash 19ac4b39a7ec82d103505d6d686e40edda64c3fa05b8403caced0cff670c5dd6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-GaxLOafsgtEDUF1taG5A7dpkw_oFuEA8rO0M_2cMXdY.pb
    Aug 30, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 30, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5827554993625789155.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-cWYtpQJeFf5aHUaDWUrWpLtRfTwPJEv-HUuMShj7OKM.jar
    Aug 30, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 30, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 30, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 30, 2021 12:45:02 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 30, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 30, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 30, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 30, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 30, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-30_05_45_02-1712923631587055536?project=apache-beam-testing
    Aug 30, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-30_05_45_02-1712923631587055536
    Aug 30, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-30_05_45_02-1712923631587055536
    Aug 30, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-30T12:45:06.613Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 30, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:13.129Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:13.914Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:13.965Z: Expanding GroupByKey operations into optimizable parts.
    Aug 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:14.000Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:14.073Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:14.100Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:14.126Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:14.149Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:14.469Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:14.546Z: Starting 5 workers in us-central1-a...
    Aug 30, 2021 12:45:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:42.358Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 30, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:45:57.784Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 30, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:46:25.095Z: Workers have started successfully.
    Aug 30, 2021 12:46:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:46:25.125Z: Workers have started successfully.
    Aug 30, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:46:53.264Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:46:53.420Z: Cleaning up.
    Aug 30, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:46:53.495Z: Stopping worker pool...
    Aug 30, 2021 12:49:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:49:15.194Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 30, 2021 12:49:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T12:49:15.221Z: Worker pool stopped.
    Aug 30, 2021 12:49:21 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-30_05_45_02-1712923631587055536 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 91e4e1bd-b6a3-47d8-ab05-199e88f7b67d and timestamp: 2021-08-30T12:49:21.899000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     10.45

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2021 12:49:22 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 35.501 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 4s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/v3gqn4qb6fq7e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2363

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2363/display/redirect>

Changes:


------------------------------------------
[...truncated 347.61 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
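
This is the stock Beam coder failure for transforms that emit Row: a Row carries no default coder, so the pipeline has to be told its schema explicitly. A minimal sketch of the fix the message itself suggests (PCollection.setRowSchema), with a hypothetical four-field schema standing in for the real HACKER_NEWS projection:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema standing in for the four projected fields.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Row row = Row.withSchema(schema).addValues("a", "story", "t", 3L).build();

        PCollection<Row> rows =
            p.apply(Create.of(row).withRowSchema(schema))
                // A ParDo that outputs Row cannot have its coder inferred...
                .apply(ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void process(@Element Row r, OutputReceiver<Row> out) {
                        out.output(r);
                      }
                    }))
                // ...so attach the schema explicitly, as the error message asks.
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }

Note that only the DEFAULT-method read path trips over this; the push-down test below builds its output with a schema attached and runs to completion.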

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 30, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
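
The BEAMPlan above shows what the SQL push-down buys: the projection (usedFields) and the supported filter are folded into the IO source, so only the matching rows and columns cross the BigQuery Storage Read API. Outside Beam SQL, roughly the same direct read can be requested by hand; a sketch under assumed details (the table reference and exact restriction syntax are illustrative, while the field list and predicate are taken from the log):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    public class DirectReadPushDown {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        p.apply(
            "Read Input BQ Rows with push-down",
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full") // hypothetical table reference
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Projection push-down: only these four fields leave BigQuery.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: the restriction the planner reports above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
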
    Aug 30, 2021 6:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 30, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 30, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115849 bytes, hash bd72211ecb8275b0aff0c0359a210d1a764fb903fa7a74eb821f2c7c6d4363e5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vXIhHsuCdbCv8MA1miENGnZPuQP6enTrgh8sfG1DY-U.pb
    Aug 30, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 30, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 30, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2970509805631642416.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-d2emmBsQxz7fhMEni5AJ-7WbjHbUon9YLJYoeL7RYE8.jar
    Aug 30, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 30, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 30, 2021 6:45:06 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
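
The SEVERE message is gRPC's orphaned-channel detector: a ManagedChannel created while validating the BigQuery sink was garbage-collected without an orderly shutdown. It does not fail the test, but the hygiene the message asks for looks roughly like this (a generic sketch, not the Beam-internal fix):

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdown {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forAddress("bigquerystorage.googleapis.com", 443)
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs through the channel ...
        } finally {
          // What the warning asks for: orderly shutdown, then wait for termination.
          channel.shutdown();
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-close anything still in flight
          }
        }
      }
    }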

    Aug 30, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 30, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 30, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 30, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 30, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-29_23_45_07-6235665141353140311?project=apache-beam-testing
    Aug 30, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-29_23_45_07-6235665141353140311
    Aug 30, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-29_23_45_07-6235665141353140311
    Aug 30, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-30T06:45:10.532Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 30, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:16.698Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 30, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:17.395Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 30, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:17.434Z: Expanding GroupByKey operations into optimizable parts.
    Aug 30, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:17.460Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 30, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:17.532Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 30, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:17.556Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 30, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:17.589Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 30, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:17.622Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 30, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:17.970Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:18.045Z: Starting 5 workers in us-central1-c...
    Aug 30, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:45:46.268Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 30, 2021 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:46:07.452Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 30, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:46:33.122Z: Workers have started successfully.
    Aug 30, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:46:33.149Z: Workers have started successfully.
    Aug 30, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:47:02.451Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:47:02.594Z: Cleaning up.
    Aug 30, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:47:02.660Z: Stopping worker pool...
    Aug 30, 2021 6:49:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:49:29.446Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 30, 2021 6:49:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T06:49:29.489Z: Worker pool stopped.
    Aug 30, 2021 6:49:34 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-29_23_45_07-6235665141353140311 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 68650ad4-cf10-4774-ac75-64e0cd7b7f9f and timestamp: 2021-08-30T06:49:34.709000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.503

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2021 6:49:35 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
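
The warning means the run's metrics were computed but dropped: InfluxDBPublisher bails out quietly when no measurement/database is configured. Assuming a builder-style settings object in org.apache.beam.sdk.testutils.publishing (the method and field names below are inferred from the warning, not confirmed against the source), wiring it up would look roughly like:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxWiring {
      public static void main(String[] args) {
        // Assumed API: with database or measurement left unset, the publisher
        // logs "Metrics won't be published" (as above) and returns quietly.
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")      // hypothetical host
                .withDatabase("beam_test_metrics")      // hypothetical database name
                .withMeasurement("sql_bqio_read_java")  // hypothetical measurement name
                .get();
        System.out.println(settings);
      }
    }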

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 45.168 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 17s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/tzaayje2lrayu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2362

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2362/display/redirect>

Changes:


------------------------------------------
[...truncated 347.79 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1883789984]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 30, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 30, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 30, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 30, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 30, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 30, 2021 12:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 30, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 30, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash a1bdf8efafe7632500a1b1a161eec63ad9f9fe2808ab75742a982029dd45ffb6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ob3476_nYyUAobGhYe7GOtn5_igIq3V0KpggKd1F_7Y.pb
    Aug 30, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 30, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 30, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5091888761676821795.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-EVUCVNo7GZsta58ZkFd2281nbAoE6PfKiiNPtdosvRs.jar
    Aug 30, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 30, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 30, 2021 12:45:07 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 30, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 30, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 30, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 30, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 30, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-29_17_45_08-14071765235140919825?project=apache-beam-testing
    Aug 30, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-29_17_45_08-14071765235140919825
    Aug 30, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-29_17_45_08-14071765235140919825
    Aug 30, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-30T00:45:13.169Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 30, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:19.467Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 30, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:20.137Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 30, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:20.166Z: Expanding GroupByKey operations into optimizable parts.
    Aug 30, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:20.195Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 30, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:20.289Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 30, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:20.316Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 30, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:20.350Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 30, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:20.372Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 30, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:20.699Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:20.771Z: Starting 5 workers in us-central1-b...
    Aug 30, 2021 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:45:42.487Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 30, 2021 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:46:00.598Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 30, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:46:25.402Z: Workers have started successfully.
    Aug 30, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:46:25.432Z: Workers have started successfully.
    Aug 30, 2021 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:46:55.653Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 30, 2021 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:46:55.747Z: Cleaning up.
    Aug 30, 2021 12:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:46:55.806Z: Stopping worker pool...
    Aug 30, 2021 12:49:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:49:16.280Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 30, 2021 12:49:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-30T00:49:16.315Z: Worker pool stopped.
    Aug 30, 2021 12:49:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-29_17_45_08-14071765235140919825 finished with status DONE.


Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e81552dd-ecff-4ce1-90e9-85ec1e75683d and timestamp: 2021-08-30T00:49:22.072000000Z:
                     Metric:                    Value:
                   read_time                     8.551
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 30, 2021 12:49:22 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 33.228 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 3s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/sncl3pet6ad6c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2361

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2361/display/redirect>

Changes:


------------------------------------------
[...truncated 348.02 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
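
The readUsingDefaultMethod failure above is Beam's standard coder-inference error for a PCollection<Row> that has no schema attached; here the schema is lost at the output of ParDo(RowMonitor). The remedy the message itself points at is PCollection.setRowSchema. A minimal self-contained sketch, with a hypothetical schema mirroring the four columns the test query selects (not the actual table schema):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Hypothetical schema mirroring the columns the test query selects.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("a", "story", "t", 3L).build())
                    .withRowSchema(schema));

        // A ParDo that re-emits Row values (like the RowMonitor above) drops the
        // schema; re-attaching it with setRowSchema is the fix the error suggests.
        rows.apply(
                ParDo.of(
                    new DoFn<Row, Row>() {
                      @ProcessElement
                      public void processElement(@Element Row row, OutputReceiver<Row> out) {
                        out.output(row);
                      }
                    }))
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }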

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2021 6:44:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2021 6:44:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2021 6:44:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
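
The query logged above can be reproduced from user code with SqlTransform. A minimal sketch, assuming hackerNews is a schema-aware PCollection<Row> of the table rows (a single input PCollection is addressed by the implicit name PCOLLECTION, so the table alias differs from the log):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class PushDownQuerySketch {
      // 'hackerNews' is assumed to be a schema-aware PCollection<Row> of the table.
      static PCollection<Row> filter(PCollection<Row> hackerNews) {
        return hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS author, `type`, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));
      }
    }
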
    Aug 29, 2021 6:44:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2021 6:44:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2021 6:44:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2021 6:44:42 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 29, 2021 6:44:42 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
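
At the IO level, the project and predicate push-down logged above corresponds to a BigQuery Storage API read with selected fields and a row restriction. Below is a hedged sketch using the public BigQueryIO API; the table reference is illustrative (the test resolves the table through its beam.HACKER_NEWS catalog entry instead), while the field list and filter mirror the log:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    class DirectReadSketch {
      static BigQueryIO.TypedRead<TableRow> read() {
        return BigQueryIO.readTableRows()
            // Illustrative table reference, not the one the test uses.
            .from("bigquery-public-data:hacker_news.full")
            .withMethod(Method.DIRECT_READ)
            // Project push-down: only the used fields are read.
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // Predicate push-down: the supported filter from the plan above.
            .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2");
      }
    }
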
    Aug 29, 2021 6:44:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 29, 2021 6:44:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 29, 2021 6:44:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 912ef38e7a63d8131c0ba4cde5e68beb867119ee7d437f5d7e46c139446ce371> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-kS7zjnpj2BMcC6TN5eaL64ZxGe59Q39dfkbBOURs43E.pb
    Aug 29, 2021 6:44:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 29, 2021 6:44:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6037816515763378135.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-EO-FDs9Z1tDVQuyDxiTxXso4rEpfTpmP8NLzeU8eobg.jar
    Aug 29, 2021 6:44:48 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 29, 2021 6:44:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 29, 2021 6:44:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 29, 2021 6:44:49 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
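
The SEVERE message above comes from grpc-java's orphaned-channel detector: a ManagedChannel was garbage-collected without being closed. It does not abort the run (the job above still finishes DONE); the hygiene it asks for looks like the following generic sketch, not the Beam-internal code path that leaked the channel:

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs over the channel ...
        } finally {
          // Exactly what the warning asks for: shutdown, then await termination.
          channel.shutdown();
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();
          }
        }
      }
    }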

    Aug 29, 2021 6:44:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 29, 2021 6:44:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 29, 2021 6:44:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 29, 2021 6:44:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 29, 2021 6:44:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-29_11_44_49-17273808306904982704?project=apache-beam-testing
    Aug 29, 2021 6:44:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-29_11_44_49-17273808306904982704
    Aug 29, 2021 6:44:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-29_11_44_49-17273808306904982704
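
For reference, the submission sequence above (staging the portable pipeline proto, the monitoring-console URL, the job id, and the gcloud cancel hint) is what DataflowRunner produces for a pipeline configured along these lines. Project, region, and temp bucket are copied from the log; the rest is a hedged sketch, not the test's actual setup:

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    class DataflowSubmitSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("apache-beam-testing");                           // from the log
        options.setRegion("us-central1");                                    // from the log
        options.setTempLocation("gs://temp-storage-for-perf-tests/loadtests"); // from the log
        options.setNumWorkers(5);                                            // requested workers above

        Pipeline p = Pipeline.create(options);
        // ... apply the read/monitor transforms ...
        p.run(); // returns a DataflowPipelineJob and logs the console URL as above
      }
    }
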
    Aug 29, 2021 6:44:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-29T18:44:52.819Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 29, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:44:58.528Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 29, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:44:59.152Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 29, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:44:59.177Z: Expanding GroupByKey operations into optimizable parts.
    Aug 29, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:44:59.198Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 29, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:44:59.242Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 29, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:44:59.266Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 29, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:44:59.292Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 29, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:44:59.315Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 29, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:44:59.551Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:44:59.603Z: Starting 5 workers in us-central1-a...
    Aug 29, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:45:20.495Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 29, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:45:55.411Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 29, 2021 6:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:46:21.130Z: Workers have started successfully.
    Aug 29, 2021 6:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:46:21.166Z: Workers have started successfully.
    Aug 29, 2021 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:46:48.953Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2021 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:46:49.083Z: Cleaning up.
    Aug 29, 2021 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:46:49.147Z: Stopping worker pool...
    Aug 29, 2021 6:49:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:49:07.307Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 29, 2021 6:49:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T18:49:07.354Z: Worker pool stopped.
    Aug 29, 2021 6:49:13 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-29_11_44_49-17273808306904982704 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d2737baf-dc79-437e-808f-af35d61cd443 and timestamp: 2021-08-29T18:49:13.152000000Z:
                     Metric:                    Value:
                   read_time                       8.3
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2021 6:49:13 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 9 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker Thread 18,5,main]) completed. Took 4 mins 40.215 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 55s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/zxeg2jteudwxa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2360

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2360/display/redirect>

Changes:


------------------------------------------
[...truncated 345.73 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2021 12:44:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2021 12:44:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2021 12:44:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2021 12:44:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2021 12:44:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2021 12:44:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2021 12:44:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 29, 2021 12:44:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 29, 2021 12:44:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 29, 2021 12:44:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 29, 2021 12:44:50 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 15d2e598ce9a071c730d92fa6365cbd6f636be5a1ecb58b2a24d49691dc801ef> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-FdLlmM6aBxxzDZL6Y2XL1vY2vloey1iyok1JaR3IAe8.pb
    Aug 29, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 29, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 29, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5315297517791887076.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-5o9da5VhA2xpC28tutb_XU9qzeRVWDCQJ0uE3n-rDzs.jar
    Aug 29, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 29, 2021 12:44:52 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 29, 2021 12:44:52 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 29, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 29, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 29, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 29, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 29, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-29_05_44_53-4433002800372859138?project=apache-beam-testing
    Aug 29, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-29_05_44_53-4433002800372859138
    Aug 29, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-29_05_44_53-4433002800372859138
    Aug 29, 2021 12:44:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-29T12:44:57.013Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 29, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:02.385Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 29, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:03.069Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 29, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:03.098Z: Expanding GroupByKey operations into optimizable parts.
    Aug 29, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:03.117Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 29, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:03.189Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 29, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:03.217Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 29, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:03.251Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 29, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:03.285Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 29, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:03.609Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:03.680Z: Starting 5 workers in us-central1-a...
    Aug 29, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:25.254Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 29, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:45:47.209Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 29, 2021 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:46:12.252Z: Workers have started successfully.
    Aug 29, 2021 12:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:46:12.277Z: Workers have started successfully.
    Aug 29, 2021 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:46:42.181Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2021 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:46:42.312Z: Cleaning up.
    Aug 29, 2021 12:46:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:46:42.386Z: Stopping worker pool...
    Aug 29, 2021 12:49:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:49:06.468Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 29, 2021 12:49:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T12:49:06.506Z: Worker pool stopped.
    Aug 29, 2021 12:49:12 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-29_05_44_53-4433002800372859138 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f2e7eb65-d627-4a96-a104-631bbb1b8a15 and timestamp: 2021-08-29T12:49:12.527000000Z:
                     Metric:                    Value:
                   read_time                      9.48
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2021 12:49:12 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 6 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.001 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.003 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker Thread 13,5,main]) completed. Took 4 mins 37.366 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 53s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/fsfubn6zwihac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2359

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2359/display/redirect>

Changes:


------------------------------------------
[...truncated 346.95 KB...]
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2021 6:44:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2021 6:44:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2021 6:44:46 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2021 6:44:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2021 6:44:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2021 6:44:46 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2021 6:44:46 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 29, 2021 6:44:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 29, 2021 6:44:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 29, 2021 6:44:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 29, 2021 6:44:50 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115844 bytes, hash 43a14aaf3552e16f7f0e31949c4877f7bc6f544e2e7d19b6a11cca43c0842772> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Q6FKrzVS4W9_DjGUnEh397xvVE4ufRm2oRzKQ8CEJ3I.pb
    Aug 29, 2021 6:44:52 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 29, 2021 6:44:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2986711837879567432.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VgKlLSdLmcAncY2iN1zuk66kSoon6GpUmqsB5hgUNqc.jar
    Aug 29, 2021 6:44:52 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 29, 2021 6:44:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.thrift/libthrift/0.14.1/85348a0c44c298bbec5ae747e67ae12e60b3aef6/libthrift-0.14.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/libthrift-0.14.1-WzUQ_nLm8HJeKc7269seq6zMxp15_E7VC2gWAKh2Z-w.jar
    Aug 29, 2021 6:44:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.tomcat.embed/tomcat-embed-core/8.5.46/5d686394334d143f48251827435ab086a161e75e/tomcat-embed-core-8.5.46.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tomcat-embed-core-8.5.46-vl-FREjS7l1uADb-srT3ExYweaG2uaepdQjlWRetNcI.jar
    Aug 29, 2021 6:44:52 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.tomcat/tomcat-annotations-api/8.5.46/56c67699de192c603afd6f029e80e5ff8d98e7e9/tomcat-annotations-api-8.5.46.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tomcat-annotations-api-8.5.46-amtG0OaVhkRRTAyjZYs7B-YSOmgqIO4203lSQnNfq8M.jar
    Aug 29, 2021 6:44:53 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 244 files cached, 4 files newly uploaded in 0 seconds
    Aug 29, 2021 6:44:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 29, 2021 6:44:53 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
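
The SEVERE warning that opens this section is gRPC's orphaned-channel detector: a ManagedChannel was garbage-collected before being closed. Per the allocation trace, the channel belongs to the BigQuery Storage write client that BigQueryServicesImpl$DatasetServiceImpl creates during pipeline validation, so the missing cleanup sits in the SDK's service wrapper rather than in the test itself. As a minimal sketch, the shutdown sequence the warning asks for looks roughly like this, assuming a channel built directly with ManagedChannelBuilder (the target and timeout values are illustrative):

import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import java.util.concurrent.TimeUnit;

public final class ChannelCleanup {
  public static void main(String[] args) throws InterruptedException {
    // Illustrative target; the leaked channel in the log points at
    // bigquerystorage.googleapis.com:443.
    ManagedChannel channel =
        ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
            .useTransportSecurity()
            .build();
    try {
      // ... issue RPCs ...
    } finally {
      // Orderly shutdown: refuse new calls, wait for in-flight ones,
      // then force-cancel anything still running.
      channel.shutdown();
      if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {
        channel.shutdownNow();
        channel.awaitTermination(10, TimeUnit.SECONDS);
      }
    }
  }
}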

    Aug 29, 2021 6:44:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 29, 2021 6:44:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 29, 2021 6:44:53 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 29, 2021 6:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 29, 2021 6:44:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-28_23_44_53-4430327611449575348?project=apache-beam-testing
    Aug 29, 2021 6:44:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-28_23_44_53-4430327611449575348
    Aug 29, 2021 6:44:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-28_23_44_53-4430327611449575348
    Aug 29, 2021 6:44:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-29T06:44:56.906Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 29, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:02.644Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 29, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:03.351Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 29, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:03.407Z: Expanding GroupByKey operations into optimizable parts.
    Aug 29, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:03.435Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 29, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:03.507Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 29, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:03.527Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 29, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:03.554Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 29, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:03.575Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 29, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:03.875Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:03.967Z: Starting 5 workers in us-central1-c...
    Aug 29, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:19.488Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 29, 2021 6:45:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:45:49.073Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 29, 2021 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:46:14.403Z: Workers have started successfully.
    Aug 29, 2021 6:46:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:46:14.437Z: Workers have started successfully.
    Aug 29, 2021 6:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:46:44.451Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2021 6:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:46:44.575Z: Cleaning up.
    Aug 29, 2021 6:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:46:44.644Z: Stopping worker pool...
    Aug 29, 2021 6:49:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:49:06.501Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 29, 2021 6:49:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T06:49:06.671Z: Worker pool stopped.
    Aug 29, 2021 6:49:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-28_23_44_53-4430327611449575348 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 970b0caf-9d70-4c9b-8f27-02e5aca0bed0 and timestamp: 2021-08-29T06:49:13.330000000Z:
                     Metric:                    Value:
                   read_time                     6.922
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2021 6:49:13 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.006 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.004 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 35.834 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 52s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/2eeevje7uws5u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2358

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2358/display/redirect>

Changes:


------------------------------------------
[...truncated 348.18 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
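
The IllegalStateException above names its own fix: a PCollection of Beam Rows has no inferable coder, so a schema must be attached with setRowSchema before the pipeline is finalized. A minimal sketch of that pattern, with a hypothetical two-field schema standing in for the real HACKER_NEWS row type:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public final class RowSchemaExample {
  // Hypothetical schema; the failing test actually reads by/type/title/score.
  private static final Schema SCHEMA =
      Schema.builder().addStringField("type").addInt32Field("score").build();

  public static void main(String[] args) {
    Pipeline p = Pipeline.create();

    PCollection<Row> rows =
        p.apply(Create.of("story:3", "job:5"))
            .apply(ParDo.of(
                new DoFn<String, Row>() {
                  @ProcessElement
                  public void process(@Element String line, OutputReceiver<Row> out) {
                    String[] parts = line.split(":");
                    out.output(
                        Row.withSchema(SCHEMA)
                            .addValues(parts[0], Integer.parseInt(parts[1]))
                            .build());
                  }
                }))
            // Without this call, coder inference for Row fails exactly as in
            // the stack trace above.
            .setRowSchema(SCHEMA);

    p.run().waitUntilFinish();
  }
}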

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 29, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 29, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 29, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 29, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 29, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
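
For reference, the BEAMPlan above is what a successful push-down looks like: BeamPushDownIOSourceRel records the four usedFields it will actually read, and the whole predicate lands in the supported{} bucket forwarded to the BigQuery Storage API (the "Pushing down the following filter" line). A minimal sketch of running a query of the same shape through Beam SQL's SqlTransform; the in-memory PCOLLECTION table here is an illustrative stand-in, since push-down only takes effect against a table provider that implements it, such as the BigQuery one:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.extensions.sql.SqlTransform;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public final class PushDownShapedQuery {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();

    // Hypothetical subset of the HACKER_NEWS columns used by the test.
    Schema schema =
        Schema.builder().addStringField("type").addInt32Field("score").build();

    PCollection<Row> hackerNews =
        p.apply(
            Create.of(
                    Row.withSchema(schema).addValues("story", 3).build(),
                    Row.withSchema(schema).addValues("comment", 9).build())
                .withRowSchema(schema));

    // Same filter shape as the pushed-down predicate in the log.
    PCollection<Row> result =
        hackerNews.apply(
            SqlTransform.query(
                "SELECT type, score FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

    p.run().waitUntilFinish();
  }
}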
    Aug 29, 2021 12:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 29, 2021 12:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 29, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115849 bytes, hash 2081484dff09d1f78bab93a9f28c966a637bfab20062e5ffe6836731a5641546> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IIFITf8J0feLq5Op8oyWamN7-rIAYuX_5oNnMaVkFUY.pb
    Aug 29, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 29, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 29, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1657859403254417456.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ebcqR1hqxlqCtLRVGRQhltTm-IDhag31AZ8ii5FQj_A.jar
    Aug 29, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 29, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 29, 2021 12:45:04 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 29, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 29, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 29, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 29, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 29, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-28_17_45_04-939539157636056372?project=apache-beam-testing
    Aug 29, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-28_17_45_04-939539157636056372
    Aug 29, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-28_17_45_04-939539157636056372
    Aug 29, 2021 12:45:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-29T00:45:08.138Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 29, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:14.761Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 29, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:15.435Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 29, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:15.478Z: Expanding GroupByKey operations into optimizable parts.
    Aug 29, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:15.517Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 29, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:15.595Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 29, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:15.626Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 29, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:15.658Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 29, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:15.692Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 29, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:16.088Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2021 12:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:16.169Z: Starting 5 workers in us-central1-c...
    Aug 29, 2021 12:45:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:45:44.738Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 29, 2021 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:46:00.280Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 29, 2021 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:46:26.912Z: Workers have started successfully.
    Aug 29, 2021 12:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:46:26.943Z: Workers have started successfully.
    Aug 29, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:47:01.094Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 29, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:47:01.241Z: Cleaning up.
    Aug 29, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:47:01.330Z: Stopping worker pool...
    Aug 29, 2021 12:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:49:25.546Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 29, 2021 12:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-29T00:49:25.602Z: Worker pool stopped.
    Aug 29, 2021 12:49:31 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-28_17_45_04-939539157636056372 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): baee545f-7f9f-458f-9d07-254ca7f520a5 and timestamp: 2021-08-29T00:49:31.392000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.415

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 29, 2021 12:49:31 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 42.816 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 12s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/wp6g6s2gnioqa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2357

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2357/display/redirect>

Changes:


------------------------------------------
[...truncated 347.63 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1883789984]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2021 6:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 28, 2021 6:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 28, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 28, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 28, 2021 6:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash c91ef291937900c7e6793448d4b45dd0b173648f48d52dd16eb5784fbaf8fb5c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-yR7ykZN5AMfmeTRI1LRd0LFzZI9I1S3RbrV4T7r4-1w.pb
    Aug 28, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 28, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 28, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1259297199863480765.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qYg5VbjwsDzyguT-iiYMWEQ4X2ZGNoaN3YNQ0-97U80.jar
    Aug 28, 2021 6:45:14 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 28, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 28, 2021 6:45:14 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 28, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 28, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 28, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 28, 2021 6:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 28, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-28_11_45_14-2462466162434520122?project=apache-beam-testing
    Aug 28, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-28_11_45_14-2462466162434520122
    Aug 28, 2021 6:45:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-28_11_45_14-2462466162434520122
    Aug 28, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-28T18:45:17.974Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 28, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:23.604Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 28, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:24.351Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 28, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:24.382Z: Expanding GroupByKey operations into optimizable parts.
    Aug 28, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:24.397Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 28, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:24.450Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 28, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:24.474Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 28, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:24.500Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 28, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:24.522Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 28, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:24.755Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:24.816Z: Starting 5 workers in us-central1-a...
    Aug 28, 2021 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:45:36.738Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 28, 2021 6:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:46:09.305Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 28, 2021 6:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:46:34.211Z: Workers have started successfully.
    Aug 28, 2021 6:46:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:46:34.245Z: Workers have started successfully.
    Aug 28, 2021 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:47:03.694Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2021 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:47:03.837Z: Cleaning up.
    Aug 28, 2021 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:47:03.910Z: Stopping worker pool...
    Aug 28, 2021 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:49:35.027Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 28, 2021 6:49:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T18:49:35.066Z: Worker pool stopped.
    Aug 28, 2021 6:49:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-28_11_45_14-2462466162434520122 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e7b3ba27-38c1-4bf6-903c-94beb13a9b9e and timestamp: 2021-08-28T18:49:40.411000000Z:
                     Metric:                    Value:
                   read_time                    10.444
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2021 6:49:41 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
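
This warning means the run finished but its metrics were dropped: the publisher was never told which measurement and database to write to. In Beam's test utilities that target is described by InfluxDBSettings; a sketch of supplying it is below (the host, database, and measurement values are assumptions for illustration, not the values this Jenkins job uses):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // Build the settings the InfluxDBPublisher needs before publishing.
    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")           // assumed InfluxDB endpoint
            .withDatabase("beam_test_metrics")           // assumed database name
            .withMeasurement("sql_bqio_read_java_batch") // assumed measurement name
            .get();
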

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 44.629 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 20s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/okhg45twhkfk4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2356

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2356/display/redirect>

Changes:


------------------------------------------
[...truncated 347.76 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
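
The IllegalStateException above lists its own remedies: give the Row PCollection an explicit schema with setRowSchema, or an explicit coder with setCoder. A minimal self-contained sketch of that fix, assuming the four fields selected by the query in these tests (the field names and types here are assumptions for illustration):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      public static void main(String[] args) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Without setRowSchema (or setCoder), a PCollection<Row> fails at
        // pipeline construction with exactly the error shown in this log.
        PCollection<Row> rows =
            p.apply(Create.of(
                    Row.withSchema(schema)
                        .addValues("alice", "story", "Hello HN", 3L)
                        .build()))
                .setRowSchema(schema);
        // Equivalent alternative: rows.setCoder(RowCoder.of(schema));

        p.run().waitUntilFinish();
      }
    }
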

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 28, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
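
The plan above shows both the projection (usedFields) and the filter being pushed into the BigQuery read itself. Push-down only applies to provider-backed tables like this one; as a self-contained illustration of the same query shape, Beam SQL can also run against any schema-aware PCollection through the built-in PCOLLECTION table. In the sketch below, hackerNews is an assumed PCollection<Row> with fields by, type, title, and score:

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Applies the same projection and filter as the plan above, but over an
    // in-pipeline collection (no push-down is exercised in this form).
    static PCollection<Row> filterStoriesAndJobs(PCollection<Row> hackerNews) {
      return hackerNews.apply(
          SqlTransform.query(
              "SELECT `by` AS author, type, title, score "
                  + "FROM PCOLLECTION "
                  + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
    }
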
    Aug 28, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 28, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 28, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash fd1c0e65310c091543a5bc1dfc466c3658b3d0ab8aed29a9a10f70f15f03d2e5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_RwOZTEMCRVDpbwd_EZsNliz0KuK7SmpoQ9w8V8D0uU.pb
    Aug 28, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 28, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 28, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4366236947174496636.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-xSfmrkdVQzJfV86sflxyTR3z6SAhy_4ka9m04pBkY2w.jar
    Aug 28, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 0 seconds
    Aug 28, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 28, 2021 12:45:04 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
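
The SEVERE entry above is grpc-java's orphaned-channel detector: a ManagedChannel created inside BigQueryServicesImpl was garbage-collected without ever being shut down. The channel here is owned by the Beam library rather than the test, but the discipline the warning asks for looks like the sketch below (the target string is illustrative):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          channel.shutdown();                       // begin orderly shutdown
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();                  // cancel anything still in flight
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }
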

    Aug 28, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 28, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 28, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 28, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 28, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-28_05_45_05-15312305636134796760?project=apache-beam-testing
    Aug 28, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-28_05_45_05-15312305636134796760
    Aug 28, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-28_05_45_05-15312305636134796760
    Aug 28, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-28T12:45:08.729Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 28, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:16.825Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 28, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:17.492Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 28, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:17.533Z: Expanding GroupByKey operations into optimizable parts.
    Aug 28, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:17.564Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 28, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:17.648Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 28, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:17.700Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 28, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:17.721Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 28, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:17.747Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 28, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:18.081Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:18.161Z: Starting 5 workers in us-central1-c...
    Aug 28, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:24.594Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 28, 2021 12:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:45:59.926Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 28, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:46:25.911Z: Workers have started successfully.
    Aug 28, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:46:25.948Z: Workers have started successfully.
    Aug 28, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:46:53.810Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:46:53.992Z: Cleaning up.
    Aug 28, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:46:54.076Z: Stopping worker pool...
    Aug 28, 2021 12:49:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:49:11.910Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 28, 2021 12:49:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T12:49:11.960Z: Worker pool stopped.
    Aug 28, 2021 12:49:19 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-28_05_45_05-15312305636134796760 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a13e395d-1da6-48e3-98ec-887c7ccb0793 and timestamp: 2021-08-28T12:49:19.890000000Z:
                     Metric:                    Value:
                   read_time                     8.587
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2021 12:49:20 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 31.796 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/2dr2jd3eteej6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2355

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2355/display/redirect>

Changes:


------------------------------------------
[...truncated 347.86 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 28, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 28, 2021 6:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 28, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 28, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash d70bf97a5458cc8bf85d77feef909a424c2484f2a7e6dfc5d89704475adae402> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1wv5elRYzIv4XXf-75CaQkwkhPKn5t_F2JcER1ra5AI.pb
    Aug 28, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 28, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 28, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5457866242335079711.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YGbKHHZ5QTKmDLcgP1Q0A-ZZLBcJDJIsSiK-Gb1vJQQ.jar
    Aug 28, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 0 seconds
    Aug 28, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 28, 2021 6:45:04 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 28, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 28, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 28, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 28, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 28, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-27_23_45_05-10448866333211993289?project=apache-beam-testing
    Aug 28, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-27_23_45_05-10448866333211993289
    Aug 28, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-27_23_45_05-10448866333211993289
    Aug 28, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-28T06:45:08.312Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 28, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:16.220Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 28, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:16.993Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 28, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:17.022Z: Expanding GroupByKey operations into optimizable parts.
    Aug 28, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:17.058Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 28, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:17.134Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 28, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:17.168Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 28, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:17.200Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 28, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:17.238Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 28, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:17.542Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:17.816Z: Starting 5 workers in us-central1-c...
    Aug 28, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:45:41.179Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 28, 2021 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:46:08.252Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 28, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:46:33.800Z: Workers have started successfully.
    Aug 28, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:46:33.826Z: Workers have started successfully.
    Aug 28, 2021 6:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:47:08.735Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2021 6:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:47:08.888Z: Cleaning up.
    Aug 28, 2021 6:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:47:08.981Z: Stopping worker pool...
    Aug 28, 2021 6:49:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:49:28.697Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 28, 2021 6:49:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T06:49:28.734Z: Worker pool stopped.
    Aug 28, 2021 6:49:34 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-27_23_45_05-10448866333211993289 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f5d90799-762b-42fe-9af4-37883b59b981 and timestamp: 2021-08-28T06:49:34.722000000Z:
                     Metric:                    Value:
                   read_time                    11.642
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2021 6:49:35 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 46.467 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 16s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/myid2b2dl7452

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2354

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2354/display/redirect?page=changes>

Changes:

[randomstep] [BEAM-12706] Bump Apache Arrow to 5.0.0

[randomstep] [BEAM-12706] Bump Apache Arrow to 5.0.0


------------------------------------------
[...truncated 355.57 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2021 12:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2021 12:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2021 12:45:22 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 28, 2021 12:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 28, 2021 12:45:22 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 28, 2021 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 28, 2021 12:45:23 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 28, 2021 12:45:23 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 28, 2021 12:45:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 28, 2021 12:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 28, 2021 12:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash bee51b1b0c997aa1b7db7409390a06b115b6058d41e1d84aa2e8d04eaeb90fda> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vuUbGwyZeqG323QJOQoGsRW2BY1B4dhKoujQTq65D9o.pb
    Aug 28, 2021 12:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 28, 2021 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 28, 2021 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2649277089587684581.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ht2rjwezEnhrV8fVNiIlmVxkCopkIeW99wvFlCxPaNA.jar
    Aug 28, 2021 12:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 0 seconds
    Aug 28, 2021 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 28, 2021 12:45:29 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
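
The SEVERE message is gRPC's orphaned-channel detector: a BigQueryWriteClient created during pipeline validation was garbage-collected without its channel being closed. The advice in the message translates to the usual shutdown pattern; an illustrative sketch (the leaked channel here lives inside the client library, not in the test's own code):

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    static void closeChannel(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();                                   // stop accepting new calls
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) { // wait for in-flight calls
        channel.shutdownNow();                              // force-cancel stragglers
        channel.awaitTermination(5, TimeUnit.SECONDS);
      }
    }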

    Aug 28, 2021 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 28, 2021 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 28, 2021 12:45:29 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 28, 2021 12:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 28, 2021 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-27_17_45_30-6589286990325717439?project=apache-beam-testing
    Aug 28, 2021 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-27_17_45_30-6589286990325717439
    Aug 28, 2021 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-27_17_45_30-6589286990325717439
    Aug 28, 2021 12:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-28T00:45:33.649Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
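
This warning reports option interplay rather than a failure: with autoscaling disabled, numWorkers fixes the pool size and maxNumWorkers is ignored. A sketch of the relevant Dataflow worker-pool options (setter names are the standard DataflowPipelineOptions API; the values are illustrative):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    DataflowPipelineOptions opts =
        PipelineOptionsFactory.as(DataflowPipelineOptions.class);
    opts.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE); // fixed worker pool
    opts.setNumWorkers(5);     // the actual pool size when autoscaling is off
    opts.setMaxNumWorkers(5);  // ignored in this mode, hence the warning
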
    Aug 28, 2021 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:45:39.685Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 28, 2021 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:45:40.486Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 28, 2021 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:45:40.527Z: Expanding GroupByKey operations into optimizable parts.
    Aug 28, 2021 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:45:40.563Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 28, 2021 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:45:40.641Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 28, 2021 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:45:40.667Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 28, 2021 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:45:40.692Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 28, 2021 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:45:40.724Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 28, 2021 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:45:41.067Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2021 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:45:41.147Z: Starting 5 workers in us-central1-a...
    Aug 28, 2021 12:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:46:02.101Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 28, 2021 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:46:27.230Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 28, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:46:53.657Z: Workers have started successfully.
    Aug 28, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:46:53.688Z: Workers have started successfully.
    Aug 28, 2021 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:47:22.526Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 28, 2021 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:47:22.701Z: Cleaning up.
    Aug 28, 2021 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:47:22.772Z: Stopping worker pool...
    Aug 28, 2021 12:49:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:49:45.789Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 28, 2021 12:49:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-28T00:49:45.832Z: Worker pool stopped.
    Aug 28, 2021 12:49:51 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-27_17_45_30-6589286990325717439 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9a7cbe27-f543-49ef-a31d-3ed33b6dfc01 and timestamp: 2021-08-28T00:49:51.608000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.673

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 28, 2021 12:49:51 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
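
The metrics are computed but not persisted because the publisher has no target configured. In Beam's test utilities this is normally supplied through InfluxDBSettings; a sketch under the assumption that the builder fields correspond to the missing measurement/database properties (all values are placeholders):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")       // placeholder host
            .withDatabase("beam_test_metrics")       // the missing 'database'
            .withMeasurement("sql_bqio_read_batch")  // the missing 'measurement'
            .get();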

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 38.766 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 32s
152 actionable tasks: 102 executed, 50 from cache

Publishing build scan...
https://gradle.com/s/qqym2s4coke3g

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2353

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2353/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-12270] TPC-DS: Add schema projection for Parquet source

[noreply] [BEAM-12810] Reverting PR-15185 (#15402)


------------------------------------------
[...truncated 348.49 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1069634906]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
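
The exception text spells out both fixes: give the PCollection of Rows a schema (preferred for Beam Row) or set a coder explicitly. A minimal sketch of the first option (the schema fields follow the query's projection; this is illustrative, not the IT's actual code):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    static PCollection<Row> attachSchema(PCollection<Row> rows) {
      // Hypothetical schema matching the projected columns author/type/title/score.
      Schema schema =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();
      // Equivalent alternative: rows.setCoder(RowCoder.of(schema))
      return rows.setRowSchema(schema);
    }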

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
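
For context, the same query can be expressed with the SDK's SqlTransform against a registered input; a hedged sketch in which the table is bound as a tagged PCollection named HACKER_NEWS (this differs from the IT's table-provider setup, and hackerNewsRows is a placeholder for a schema-aware PCollection<Row>):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    PCollection<Row> result =
        PCollectionTuple.of(new TupleTag<>("HACKER_NEWS"), hackerNewsRows)
            .apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
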
    Aug 27, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2021 6:45:17 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 27, 2021 6:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 27, 2021 6:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 27, 2021 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 27, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash ce0ef7a11a07b7088daeabab80aeaafa1d5074e1ae593d5a5166d008f6551f5f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zg73oRoHtwiNrqurgK6q-h1QdOGuWT1aUWbQCPZVH18.pb
    Aug 27, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 27, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 27, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6796081154280946425.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-42oWrcM6dtQYQ-03hLKpUQKXp3eTPX4uxVk38moLcF4.jar
    Aug 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 27, 2021 6:45:26 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 27, 2021 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 27, 2021 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-27_11_45_26-18352251674677595447?project=apache-beam-testing
    Aug 27, 2021 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-27_11_45_26-18352251674677595447
    Aug 27, 2021 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-27_11_45_26-18352251674677595447
    Aug 27, 2021 6:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-27T18:45:30.414Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 27, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:45:37.987Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 27, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:45:38.627Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 27, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:45:38.658Z: Expanding GroupByKey operations into optimizable parts.
    Aug 27, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:45:38.701Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 27, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:45:38.771Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 27, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:45:38.806Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 27, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:45:38.841Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 27, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:45:38.866Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 27, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:45:39.209Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:45:39.290Z: Starting 5 workers in us-central1-c...
    Aug 27, 2021 6:46:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:46:05.172Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 27, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:46:24.400Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 27, 2021 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:46:49.439Z: Workers have started successfully.
    Aug 27, 2021 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:46:49.477Z: Workers have started successfully.
    Aug 27, 2021 6:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:47:30.666Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2021 6:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:47:30.826Z: Cleaning up.
    Aug 27, 2021 6:47:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:47:30.906Z: Stopping worker pool...
    Aug 27, 2021 6:49:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:49:46.666Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 27, 2021 6:49:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T18:49:46.712Z: Worker pool stopped.
    Aug 27, 2021 6:50:00 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-27_11_45_26-18352251674677595447 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): fcbb463d-3d34-4bb7-9b0c-4bc8adb78f78 and timestamp: 2021-08-27T18:50:00.312000000Z:
                     Metric:                    Value:
                   read_time                    15.997
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2021 6:50:00 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 55.344 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 34s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/rlgc6qvdugrcq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2352

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2352/display/redirect>

Changes:


------------------------------------------
[...truncated 347.73 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 27, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 27, 2021 12:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 27, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 27, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash fad8c24f7a5c41438de4512e4e3a16a0e106de99bf6ad9795d59b1fe77949a40> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--tjCT3pcQUON5FEuTjoWoOEG3pm_atl5XVmx_neUmkA.pb
    Aug 27, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 27, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 27, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7453673999759641655.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-nje4ck9BqC-9cW9-kv-OEnsgKvaObd4qdpLO-WxEiWU.jar
    Aug 27, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 27, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 27, 2021 12:45:04 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 27, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 27, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 27, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 27, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 27, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-27_05_45_04-17196895943486268096?project=apache-beam-testing
    Aug 27, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-27_05_45_04-17196895943486268096
    Aug 27, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-27_05_45_04-17196895943486268096
    Aug 27, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-27T12:45:08.337Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 27, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:14.087Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 27, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:14.746Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 27, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:14.803Z: Expanding GroupByKey operations into optimizable parts.
    Aug 27, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:14.836Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 27, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:14.950Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 27, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:15.007Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 27, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:15.058Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 27, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:15.092Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 27, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:15.471Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:15.563Z: Starting 5 workers in us-central1-a...
    Aug 27, 2021 12:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:45:33.435Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 27, 2021 12:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:46:05.759Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 27, 2021 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:46:32.338Z: Workers have started successfully.
    Aug 27, 2021 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:46:32.361Z: Workers have started successfully.
    Aug 27, 2021 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:47:01.291Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2021 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:47:01.454Z: Cleaning up.
    Aug 27, 2021 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:47:01.543Z: Stopping worker pool...
    Aug 27, 2021 12:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:49:22.769Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 27, 2021 12:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T12:49:22.813Z: Worker pool stopped.
    Aug 27, 2021 12:49:29 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-27_05_45_04-17196895943486268096 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): da3a126b-366c-4370-ae78-fcee682bbec8 and timestamp: 2021-08-27T12:49:29.074000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.101
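
The read_time and fields_read values above are Beam metrics harvested from the finished Dataflow job (read_time is presumably seconds; fields_read a raw field count). Below is a minimal sketch of how a monitor DoFn can record such metrics; all names are illustrative, and the run's real monitors are the ParDo(RowMonitor) and ParDo(TimeMonitor) steps visible in the pipeline, whose sources are not part of this log:

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.values.Row;

    // Illustrative pass-through monitor: counts fields read and records a
    // wall-clock timestamp per element, which a test harness can later turn
    // into figures like those printed above.
    class MonitorFn extends DoFn<Row, Row> {
      private final Counter fieldsRead = Metrics.counter("perf", "fields_read");
      private final Distribution readTime = Metrics.distribution("perf", "read_time");

      @ProcessElement
      public void processElement(ProcessContext c) {
        fieldsRead.inc(c.element().getFieldCount());
        readTime.update(System.currentTimeMillis());
        c.output(c.element());
      }
    }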

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2021 12:49:29 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 40.537 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4pprkfngh4z4o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2351

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2351/display/redirect?page=changes>

Changes:

[baeminbo] [BEAM-12751] Set clientRequestId for Dataflow python job creation

[Daniel Oliveira] [GoSDK Infra] Bugfix: Parallelism ignored when using endpoint flag.


------------------------------------------
[...truncated 350.39 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
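
This coder error is the recurring failure across these runs: the PCollection produced by ParDo(RowMonitor) carries Beam Rows without a schema, so no coder can be inferred and pipeline construction aborts. Below is a minimal sketch of the remedy the message itself names, PCollection.setRowSchema; the field names and types are assumptions read off the query in this log, and RowMonitorFn is an illustrative stand-in, not the test's real DoFn:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    Schema schema = Schema.builder()
        .addNullableField("author", Schema.FieldType.STRING)
        .addNullableField("type", Schema.FieldType.STRING)
        .addNullableField("title", Schema.FieldType.STRING)
        .addNullableField("score", Schema.FieldType.INT64)
        .build();

    // rows: the upstream PCollection<Row> (illustrative).
    PCollection<Row> monitored =
        rows.apply(ParDo.of(new RowMonitorFn()))  // hypothetical monitor DoFn
            .setRowSchema(schema);                // lets the Row coder be inferred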

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2021 6:45:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])
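
The BEAMPlan above shows the projection and filter folded into a single BeamPushDownIOSourceRel, meaning only the four used fields are read and the predicate is evaluated by BigQuery itself. A minimal sketch of how such a plan is materialized, using the same BeamSqlRelUtils.toPCollection call that appears in the stack traces; sqlEnv and pipeline are illustrative names, assuming a BeamSqlEnv with the BigQuery table provider already registered:

    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamRelNode;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // parseQuery yields the relational tree logged as BEAMPlan>;
    // toPCollection expands it into pipeline transforms.
    BeamRelNode plan = sqlEnv.parseQuery(
        "SELECT `by` AS author, type, title, score "
            + "FROM HACKER_NEWS "
            + "WHERE (type = 'story' OR type = 'job') AND score > 2");
    PCollection<Row> rows = BeamSqlRelUtils.toPCollection(pipeline, plan);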

    Aug 27, 2021 6:45:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 27, 2021 6:45:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 27, 2021 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 27, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 29848d1bf4b8fac21e7714181615e7436c1639c649053e51196d6a93393d3492> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KYSNG_S4-sIedxQYFhXnQ2wWOcZJBT5RGW1qkzk9NJI.pb
    Aug 27, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 27, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 27, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6978499306285611130.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dQAr0FdZpUT8A3zVkJFAn7E-XSspqqWQcoK0CKHeKk0.jar
    Aug 27, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 27, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 27, 2021 6:45:21 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
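
The SEVERE entry above is gRPC's orphaned-channel detector: pipeline validation (BigQueryIO$TypedRead.validate in the trace) creates a BigQueryWriteClient whose ManagedChannel is later garbage-collected without being closed. The generic discipline the warning asks for is sketched below as plain gRPC usage, not as the Beam-internal fix:

    import java.util.concurrent.TimeUnit;

    import io.grpc.ManagedChannel;

    // Gracefully close a channel, escalating to shutdownNow() on timeout.
    static void closeChannel(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();                               // start graceful shutdown
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
        channel.shutdownNow();                          // force-close stragglers
        channel.awaitTermination(5, TimeUnit.SECONDS);  // final wait
      }
    }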

    Aug 27, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 27, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 27, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 27, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 27, 2021 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-26_23_45_21-11428620118465056892?project=apache-beam-testing
    Aug 27, 2021 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-26_23_45_21-11428620118465056892
    Aug 27, 2021 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-26_23_45_21-11428620118465056892
    Aug 27, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-27T06:45:25.266Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 27, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:30.605Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 27, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:31.398Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 27, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:31.439Z: Expanding GroupByKey operations into optimizable parts.
    Aug 27, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:31.463Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 27, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:31.533Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 27, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:31.561Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 27, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:31.590Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 27, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:31.616Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 27, 2021 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:31.902Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2021 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:31.966Z: Starting 5 workers in us-central1-a...
    Aug 27, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:45:41.243Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 27, 2021 6:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:46:19.152Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 27, 2021 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:46:43.985Z: Workers have started successfully.
    Aug 27, 2021 6:46:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:46:44.017Z: Workers have started successfully.
    Aug 27, 2021 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:47:12.280Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2021 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:47:12.384Z: Cleaning up.
    Aug 27, 2021 6:47:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:47:12.445Z: Stopping worker pool...
    Aug 27, 2021 6:49:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:49:38.321Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 27, 2021 6:49:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T06:49:38.347Z: Worker pool stopped.
    Aug 27, 2021 6:49:43 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-26_23_45_21-11428620118465056892 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c3b21bf9-8bb8-45ad-88c7-6bed440bd072 and timestamp: 2021-08-27T06:49:43.575000000Z:
                     Metric:                    Value:
                   read_time                     8.789
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2021 6:49:43 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 39.484 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 26s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/mxi4gycdeyply

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2350

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2350/display/redirect?page=changes>

Changes:

[Ahmet Altay] Clean unused methods out of apiclient.py

[Ahmet Altay] fix

[Ahmet Altay] lint

[Ahmet Altay] fix

[Ankur Goenka] Add a blogpost for Apache Beam 2.32.0

[Ankur Goenka] Updating date for beam 2.32.0 blog post

[Ankur Goenka] Fixing author name

[Kyle Weaver] [BEAM-12320] Sickbay testSQLSelectsArrayAttributes.

[noreply] [BEAM-12742] SamzaTimerInternalsFactory#deleteTimer(TimerData) does not

[noreply] Merge pull request #15183 from [BEAM-11983] Java Datastore - Implement

[noreply] Merge pull request #15328 from [BEAM-11987] Python Datastore - Implement


------------------------------------------
[...truncated 361.30 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2021 12:47:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2021 12:47:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2021 12:47:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 27, 2021 12:47:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 27, 2021 12:47:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 27, 2021 12:47:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 27, 2021 12:47:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 27, 2021 12:47:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 27, 2021 12:47:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 27, 2021 12:47:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 27, 2021 12:47:33 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 973acbfd79d244a663da1461a2cc349a79c25b99971181a4a1c67dc6138413bc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-lzrL_XnSRKZj2hRhosw0mnnCW5mXEYGkocZ9xhOEE7w.pb
    Aug 27, 2021 12:47:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 27, 2021 12:47:35 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-MwQ8hxc3iXtrqqAWhicqNo1fwz1sZE0j2JCI9U0JcoQ.jar
    Aug 27, 2021 12:47:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2106518100075848180.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-d0Mi8utLNmp-dYjdqZQAz-YMzXzo_aGZTRjl9Udto3o.jar
    Aug 27, 2021 12:47:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests-LpAeUO6biHrADNtPjoSYraUn6_5jhC8kASRP5yjUbSg.jar
    Aug 27, 2021 12:47:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 2 files newly uploaded in 0 seconds
    Aug 27, 2021 12:47:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 27, 2021 12:47:36 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 27, 2021 12:47:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 27, 2021 12:47:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 27, 2021 12:47:36 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 27, 2021 12:47:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 27, 2021 12:47:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-26_17_47_36-7441195195130317972?project=apache-beam-testing
    Aug 27, 2021 12:47:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-26_17_47_36-7441195195130317972
    Aug 27, 2021 12:47:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-26_17_47_36-7441195195130317972
    Aug 27, 2021 12:47:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-27T00:47:40.281Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 27, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:47.394Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 27, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:48.130Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 27, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:48.194Z: Expanding GroupByKey operations into optimizable parts.
    Aug 27, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:48.224Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 27, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:48.291Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 27, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:48.325Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 27, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:48.352Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 27, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:48.378Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 27, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:48.752Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:48.825Z: Starting 5 workers in us-central1-b...
    Aug 27, 2021 12:47:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:47:54.501Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 27, 2021 12:48:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:48:39.702Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 27, 2021 12:49:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:49:05.925Z: Workers have started successfully.
    Aug 27, 2021 12:49:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:49:05.954Z: Workers have started successfully.
    Aug 27, 2021 12:49:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:49:37.117Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 27, 2021 12:49:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:49:37.284Z: Cleaning up.
    Aug 27, 2021 12:49:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:49:37.357Z: Stopping worker pool...
    Aug 27, 2021 12:52:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:52:04.264Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 27, 2021 12:52:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-27T00:52:04.315Z: Worker pool stopped.
    Aug 27, 2021 12:52:10 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-26_17_47_36-7441195195130317972 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 16ed4a52-d9fa-43c1-bee9-b863b7ee6d82 and timestamp: 2021-08-27T00:52:10.312000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.863

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 27, 2021 12:52:10 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 5 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.039 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 51.849 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 52s
152 actionable tasks: 105 executed, 47 from cache

Publishing build scan...
https://gradle.com/s/tmfgyqxo3bmxq

Stopped 4 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2349

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2349/display/redirect>

Changes:


------------------------------------------
[...truncated 354.32 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2071944899]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2021 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2021 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2021 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2021 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2021 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2021 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2021 6:45:21 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 26, 2021 6:45:21 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
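
For context on the plan above: the planner rewrote the LogicalProject/LogicalFilter pair into a BeamPushDownIOSourceRel, so both the projected field list and the supported predicate travel into the BigQuery storage read itself. A rough sketch of driving the same query through the SQL extension, assuming the in-memory environment and BigQuery table provider named in the stack traces (the DDL, location, and table properties here are illustrative, not the IT's exact setup):

    import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
    import org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    BeamSqlEnv sqlEnv = BeamSqlEnv.inMemory(new BigQueryTableProvider());
    // Hypothetical table registration; the real table spec and full column
    // list live in the test's setup, not in this log.
    sqlEnv.executeDdl(
        "CREATE EXTERNAL TABLE HACKER_NEWS "
            + "(title VARCHAR, `by` VARCHAR, score BIGINT, type VARCHAR) "
            + "TYPE 'bigquery' "
            + "LOCATION 'some-project:beam.HACKER_NEWS' "
            + "TBLPROPERTIES '{ \"method\": \"DIRECT_READ\" }'");
    // "pipeline" is an already-created Beam Pipeline.
    PCollection<Row> result = BeamSqlRelUtils.toPCollection(
        pipeline,
        sqlEnv.parseQuery(
            "SELECT `by` AS author, type, title, score FROM HACKER_NEWS "
                + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

With method DIRECT_READ, BigQueryTable.buildIOReader can hand the supported predicate to the storage API, which is what the "Pushing down the following filter" line reports.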
    Aug 26, 2021 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 26, 2021 6:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 26, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 83ad0df9fa835dbf851a54bc60aef9bacb910f87b85bc52a0910a844c9f9d6e9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-g60N-fqDXb-FGlS8YK75usuRD4e4W8UqCRCoRMn51uk.pb
    Aug 26, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 26, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 26, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3603839950205337198.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ZSsmYVHV2agHlcmc6WB658IwCKJi6a2XCK3jculAlis.jar
    Aug 26, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 2 seconds
    Aug 26, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 26, 2021 6:45:30 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
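
The SEVERE block above is gRPC's orphaned-channel detector rather than a test failure in itself: a BigQuery write client created while validating the pipeline was garbage-collected with its ManagedChannel still open. The discipline the message asks for, as a plain grpc-java sketch (generic usage, not Beam's internal client code):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    static void useChannel() throws InterruptedException {
      ManagedChannel channel = ManagedChannelBuilder
          .forTarget("bigquerystorage.googleapis.com:443")
          .useTransportSecurity()
          .build();
      try {
        // ... issue RPCs on the channel ...
      } finally {
        channel.shutdown();                             // stop accepting new calls
        if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {
          channel.shutdownNow();                        // hard-cancel stragglers
        }
      }
    }

In the Beam frames on the trace, the equivalent would be closing the GAX BigQueryWriteClient (it is AutoCloseable) instead of letting it fall out of scope.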

    Aug 26, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 26, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 26, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 26, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 26, 2021 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-26_11_45_30-13180695522471214331?project=apache-beam-testing
    Aug 26, 2021 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-26_11_45_30-13180695522471214331
    Aug 26, 2021 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-26_11_45_30-13180695522471214331
    Aug 26, 2021 6:45:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-26T18:45:34.106Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 26, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:45:39.896Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 26, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:45:40.660Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 26, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:45:40.701Z: Expanding GroupByKey operations into optimizable parts.
    Aug 26, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:45:40.728Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 26, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:45:40.802Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 26, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:45:40.837Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 26, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:45:40.863Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 26, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:45:40.891Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 26, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:45:41.207Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:45:41.283Z: Starting 5 workers in us-central1-a...
    Aug 26, 2021 6:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:46:13.424Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 26, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:46:23.121Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 26, 2021 6:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:46:50.116Z: Workers have started successfully.
    Aug 26, 2021 6:46:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:46:50.138Z: Workers have started successfully.
    Aug 26, 2021 6:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:47:21.702Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2021 6:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:47:21.828Z: Cleaning up.
    Aug 26, 2021 6:47:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:47:21.913Z: Stopping worker pool...
    Aug 26, 2021 6:49:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:49:43.809Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 26, 2021 6:49:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T18:49:43.851Z: Worker pool stopped.
    Aug 26, 2021 6:49:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-26_11_45_30-13180695522471214331 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d19ac6a9-0329-4e1b-882d-74f42cbe470f and timestamp: 2021-08-26T18:49:50.904000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.765

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2021 6:49:51 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
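
The InfluxDB warning just above means the run's metrics (read_time, fields_read) were computed but never exported. In Beam's test utilities these settings normally arrive as pipeline options; a hedged example of supplying them (option names assumed from the testutils conventions, not verified against this Jenkins job's configuration):

    --influxHost=http://localhost:8086 --influxDatabase=beam_test_metrics --influxMeasurement=sql_bqio_read

Without a database and measurement, InfluxDBPublisher.publishWithCheck logs this warning and skips publishing, so the numbers above exist only in the console output.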

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 39.899 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 31s
152 actionable tasks: 102 executed, 50 from cache

Publishing build scan...
https://gradle.com/s/ljujkr2y3zqcs

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2348

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2348/display/redirect>

Changes:


------------------------------------------
[...truncated 348.05 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 26, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 26, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 26, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 26, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115846 bytes, hash 02eddde8dbd38703b70bc51efb81cf9ca52802e2cd2c7f66da836eed9aeba3ba> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Au3d6NvThwO3C8Ue-4HPnKUoAuLNLH9m2oNu7Zrro7o.pb
    Aug 26, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 26, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 26, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3063903957813969368.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-71xSN4Q8z5-yBlFgytvq_4MH_HRLRbcbbGTMWaRbP5M.jar
    Aug 26, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 26, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 26, 2021 12:45:07 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 26, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 26, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 26, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 26, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 26, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-26_05_45_08-7540574920900799189?project=apache-beam-testing
    Aug 26, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-26_05_45_08-7540574920900799189
    Aug 26, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-26_05_45_08-7540574920900799189
    Aug 26, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-26T12:45:13.812Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:18.887Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:19.570Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:19.594Z: Expanding GroupByKey operations into optimizable parts.
    Aug 26, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:19.621Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:19.686Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:19.713Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:19.750Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:19.777Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:20.078Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:20.155Z: Starting 5 workers in us-central1-a...
    Aug 26, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:46.312Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:50.047Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 26, 2021 12:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:45:50.079Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 26, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:46:00.486Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 26, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:46:25.332Z: Workers have started successfully.
    Aug 26, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:46:25.365Z: Workers have started successfully.
    Aug 26, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:46:54.403Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:46:54.534Z: Cleaning up.
    Aug 26, 2021 12:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:46:54.617Z: Stopping worker pool...
    Aug 26, 2021 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:49:20.572Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 26, 2021 12:49:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T12:49:20.611Z: Worker pool stopped.
    Aug 26, 2021 12:49:28 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-26_05_45_08-7540574920900799189 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 74c830ba-aece-403b-a9d3-4e345c5481e4 and timestamp: 2021-08-26T12:49:28.292000000Z:
                     Metric:                    Value:
                   read_time                     9.352
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2021 12:49:28 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 36.73 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 9s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/mcq55myc26yme

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2347

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2347/display/redirect?page=changes>

Changes:

[Ankur Goenka] Update Beam website to release 2.32.0

[noreply] Update website/www/site/content/en/get-started/downloads.md


------------------------------------------
[...truncated 353.05 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 26, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 26, 2021 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 26, 2021 6:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 26, 2021 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 82ebf83a9f8b78d185dd2c8ad5db2330c207ec921b6be881153fb1cf6bbe890b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-guv4Op-LeNGF3SyK1dsjMMIH7JIba-iBFT-xz2u-iQs.pb
    Aug 26, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 26, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 26, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test234351679301724061.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-AHMUZXB3s2MmR-n2Zo5iVFJABClBVlCIRkk6cLYX6Dg.jar
    Aug 26, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 26, 2021 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 26, 2021 6:45:27 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 26, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 26, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 26, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 26, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 26, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-25_23_45_28-13168113196856910036?project=apache-beam-testing
    Aug 26, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-25_23_45_28-13168113196856910036
    Aug 26, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-25_23_45_28-13168113196856910036
    Aug 26, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-26T06:45:31.426Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 26, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:37.729Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 26, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:38.389Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 26, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:38.419Z: Expanding GroupByKey operations into optimizable parts.
    Aug 26, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:38.447Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 26, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:38.520Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 26, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:38.538Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 26, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:38.572Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 26, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:38.607Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 26, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:38.957Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:39.026Z: Starting 5 workers in us-central1-b...
    Aug 26, 2021 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:45:59.019Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 26, 2021 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:46:24.367Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 26, 2021 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:46:50.038Z: Workers have started successfully.
    Aug 26, 2021 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:46:50.088Z: Workers have started successfully.
    Aug 26, 2021 6:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:47:24.475Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2021 6:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:47:24.614Z: Cleaning up.
    Aug 26, 2021 6:47:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:47:24.682Z: Stopping worker pool...
    Aug 26, 2021 6:49:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:49:49.261Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 26, 2021 6:49:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T06:49:49.306Z: Worker pool stopped.
    Aug 26, 2021 6:49:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-25_23_45_28-13168113196856910036 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 70328079-2be3-4794-966a-dfe58bc397cf and timestamp: 2021-08-26T06:49:56.271000000Z:
                     Metric:                    Value:
                   read_time                    11.358
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2021 6:49:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 44.907 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 36s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/xijiukqo2vih6

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2346

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2346/display/redirect?page=changes>

Changes:

[Udi Meiri] Moving to 2.34.0-SNAPSHOT on master branch.

[noreply] Add 2.34.0 section to CHANGES.md


------------------------------------------
[...truncated 353.84 KB...]
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2021 12:59:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2021 12:59:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 26, 2021 12:59:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 26, 2021 12:59:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 26, 2021 12:59:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 26, 2021 12:59:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 26, 2021 12:59:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
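
The filter logged above is what the SQL planner pushes into the BigQuery storage read. For reference, a minimal sketch of running the same predicate through Beam SQL over an in-pipeline PCollection<Row> (the hackerNews collection, its schema, and the helper name are assumptions, not code from this test):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical helper: SqlTransform exposes its input as the table PCOLLECTION.
    static PCollection<Row> filterStoriesAndJobs(PCollection<Row> hackerNews) {
      return hackerNews.apply(
          SqlTransform.query(
              "SELECT `by` AS `author`, `type`, `title`, `score` "
                  + "FROM PCOLLECTION "
                  + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));
    }

Querying PCOLLECTION involves no storage push-down; the sketch only shows the query shape that the planner above rewrites into BeamCalcRel plus BeamPushDownIOSourceRel when the source is a BigQuery table.
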
    Aug 26, 2021 12:59:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 26, 2021 12:59:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 26, 2021 12:59:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash 1145e0da9a7bc45336ed1afe8ae152c245fd45d167be7cf65416a3face278899> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-EUXg2pp7xFM27Rr-iuFSwkX9RdFnvnz2VBaj-s4niJk.pb
    Aug 26, 2021 12:59:26 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 26, 2021 12:59:26 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 26, 2021 12:59:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-9snRLlhv_9RNHAtkdzEwt0C9GxyhH_eJndGOqadZfm4.jar
    Aug 26, 2021 12:59:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3079378259962521157.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-a3aMNTK1NBAhD3uw3M5bc9Bjj0z7_iePgTStWDMd2OU.jar
    Aug 26, 2021 12:59:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT-qsyth_X-2AynopJtGlEp0y9ZYWCE9nAZ-Mad3LL61Fw.jar
    Aug 26, 2021 12:59:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.34.0-SNAPSHOT-tests-z5Sh9dWo0DAeG5w5NaSrkooLfsJ26c7yIXDik9SzDm8.jar
    Aug 26, 2021 12:59:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.34.0-SNAPSHOT-tests-hPTh8ORAQSn0U3ZY9s7ryokdtEksosa2jOx4HFf7w2I.jar
    Aug 26, 2021 12:59:28 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT-4bF8QpaxkWlkdPssX5OV0S9rIzFuRU0pojrN2fBwjjk.jar
    Aug 26, 2021 12:59:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/udf/build/libs/beam-sdks-java-extensions-sql-udf-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-udf-2.34.0-SNAPSHOT-MlQQAS1Je1cH4lXtIOKM3zL6o97S5F7sJQ3zUailw08.jar
    Aug 26, 2021 12:59:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.34.0-SNAPSHOT-tests-I-osQrG-lSEES6RD4ibuovIeVl-_U3nOEEsBLo1rYrA.jar
    Aug 26, 2021 12:59:30 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.34.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.34.0-SNAPSHOT-xz-aSlhizJi8jigGo3amZ2g_MkglE3dBulicsi2UX5U.jar
    Aug 26, 2021 12:59:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 239 files cached, 9 files newly uploaded in 4 seconds
    Aug 26, 2021 12:59:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 26, 2021 12:59:31 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
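
The SEVERE entry above is gRPC's orphaned-channel detector: a ManagedChannel created while validating the read was garbage-collected without being closed. A minimal sketch of the shutdown discipline the warning asks for, under the assumption that the caller owns the channel (the helper name and 10-second timeout are illustrative):

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    // Hypothetical cleanup for a caller-owned channel; the timeout is an assumption.
    static void closeChannel(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();                                     // begin graceful shutdown
      if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {  // wait for in-flight RPCs
        channel.shutdownNow();                                // force-close anything left
      }
    }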

    Aug 26, 2021 12:59:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 26, 2021 12:59:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 26, 2021 12:59:31 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 26, 2021 12:59:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Aug 26, 2021 12:59:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-25_17_59_32-10884909075673196231?project=apache-beam-testing
    Aug 26, 2021 12:59:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-25_17_59_32-10884909075673196231
    Aug 26, 2021 12:59:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-25_17_59_32-10884909075673196231
    Aug 26, 2021 12:59:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-26T00:59:36.234Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 26, 2021 12:59:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T00:59:42.276Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 26, 2021 12:59:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T00:59:42.991Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 26, 2021 12:59:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T00:59:43.035Z: Expanding GroupByKey operations into optimizable parts.
    Aug 26, 2021 12:59:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T00:59:43.070Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 26, 2021 12:59:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T00:59:43.153Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 26, 2021 12:59:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T00:59:43.190Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 26, 2021 12:59:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T00:59:43.215Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 26, 2021 12:59:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T00:59:43.252Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 26, 2021 12:59:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T00:59:43.712Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2021 12:59:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T00:59:43.798Z: Starting 5 workers in us-central1-b...
    Aug 26, 2021 1:00:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T01:00:03.419Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 26, 2021 1:00:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T01:00:35.819Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 26, 2021 1:01:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T01:01:04.718Z: Workers have started successfully.
    Aug 26, 2021 1:01:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T01:01:04.761Z: Workers have started successfully.
    Aug 26, 2021 1:01:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T01:01:41.987Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 26, 2021 1:01:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T01:01:42.140Z: Cleaning up.
    Aug 26, 2021 1:01:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T01:01:42.219Z: Stopping worker pool...
    Aug 26, 2021 1:04:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T01:04:00.901Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 26, 2021 1:04:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-26T01:04:00.970Z: Worker pool stopped.
    Aug 26, 2021 1:04:08 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-25_17_59_32-10884909075673196231 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): fa27ab8b-1aed-4f2e-a3f8-3bfa0a1100d8 and timestamp: 2021-08-26T01:04:08.657000000Z:
                     Metric:                    Value:
                   read_time                     12.62
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 26, 2021 1:04:09 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.041 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 5 mins 19.655 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 46s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/bp6h5hpawcfec

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2345

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2345/display/redirect?page=changes>

Changes:

[Jan Lukavský] [BEAM-12704] Failing test for Flink primitive Read

[Pablo Estrada] Informative error for BigDecimal conversion in JdbcIO.

[Kyle Weaver] [BEAM-12764] Revert "Merge pull request #15165 from [BEAM-12593] Verify

[Kyle Weaver] [BEAM-12733] Sickbay RecommendationAICatalogItemIT.createCatalogItem

[Kyle Weaver] [BEAM-12683] Sickbay RecommendationAIIT.test_create_catalog_item

[Kyle Weaver] [BEAM-12733] Sickbay RecommendationAIPredictIT.predict

[Jan Lukavský] [BEAM-12704] Primitive Read working on Flink

[noreply] [BEAM-12778] Prevent unnecessary dry run requests to BQ (#15356)


------------------------------------------
[...truncated 348.87 KB...]
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
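
The IllegalStateException above names its own remedy: attach a schema to the Row PCollection so a RowCoder can be inferred. A minimal sketch, assuming a PCollection<Row> named rows; the field names and types are illustrative stand-ins for the real HACKER_NEWS schema:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Hypothetical helper: setRowSchema also attaches a RowCoder, which is
    // exactly what the coder-inference failure above reports as missing.
    static PCollection<Row> withRowSchema(PCollection<Row> rows) {
      Schema schema =
          Schema.builder()
              .addStringField("author")   // illustrative fields, not the real table
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score")
              .build();
      return rows.setRowSchema(schema);
    }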

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 25, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 25, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 25, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 25, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash 3fb1a33ac3c1ba4b9ea4e41ef654a6a4417c2cb6c380662070d64c606c9016fd> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-P7GjOsPBukuepOQe9lSmpEF8LLbDgGYgcNZMYGyQFv0.pb
    Aug 25, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 25, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 25, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4806572445680475863.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-N4UeUaUytCOI-0l0UO1EjiHpQE5Q6aeti4tsBor0nOA.jar
    Aug 25, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.12.7/1f45c6a397feeb46da75425619333d1cc6f90f78/mongo-java-driver-3.12.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.12.7-D_zgBJWNb9mzmuetJ37a0X9XtpcfSGsXYpxe6eE8Tao.jar
    Aug 25, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 2 files newly uploaded in 0 seconds
    Aug 25, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 25, 2021 6:45:10 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 25, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 25, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 25, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 25, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 25, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-25_11_45_10-2768591111111158900?project=apache-beam-testing
    Aug 25, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-25_11_45_10-2768591111111158900
    Aug 25, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-25_11_45_10-2768591111111158900
    Aug 25, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-25T18:45:14.048Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 25, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:20.993Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 25, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:21.786Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 25, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:21.834Z: Expanding GroupByKey operations into optimizable parts.
    Aug 25, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:21.877Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 25, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:21.962Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 25, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:21.989Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 25, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:22.020Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 25, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:22.056Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 25, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:22.446Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:22.544Z: Starting 5 workers in us-central1-b...
    Aug 25, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:45:31.801Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 25, 2021 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:46:15.306Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 25, 2021 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:46:45.605Z: Workers have started successfully.
    Aug 25, 2021 6:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:46:45.637Z: Workers have started successfully.
    Aug 25, 2021 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:47:15.278Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2021 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:47:15.439Z: Cleaning up.
    Aug 25, 2021 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:47:15.536Z: Stopping worker pool...
    Aug 25, 2021 6:49:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:49:43.592Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 25, 2021 6:49:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T18:49:43.637Z: Worker pool stopped.
    Aug 25, 2021 6:49:50 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-25_11_45_10-2768591111111158900 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): eb3ef48c-de83-4b19-9120-5855be61808a and timestamp: 2021-08-25T18:49:50.162000000Z:
                     Metric:                    Value:
                   read_time                     7.533
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2021 6:49:50 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

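The warning just above means the harness had no InfluxDB target configured, so the read_time and fields_read metrics were printed but not persisted. A minimal sketch of supplying that configuration, assuming Beam's test-utils InfluxDBSettings builder keeps its usual shape (the host, database, and measurement values are placeholders):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // Placeholder connection details; the Jenkins job would normally inject
    // these via build properties rather than hard-coding them.
    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();
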
Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 57.055 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 28s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/pquglbjblsfci

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2344

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2344/display/redirect>

Changes:


------------------------------------------
[...truncated 346.67 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

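The IllegalStateException above is Beam's standard complaint when a PCollection of Row elements reaches pipeline finalization without a schema. A minimal sketch of the remedy the message itself suggests, where the schema fields mirror the query's projection and the variable name is illustrative:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Schema matching the projected columns: author, type, title, score.
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    // rows is the PCollection<Row> produced by the ParDo(RowMonitor) step.
    rows.setRowSchema(schema);             // preferred fix for Row elements
    // Equivalent alternative: rows.setCoder(RowCoder.of(schema));
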
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2021 12:44:44 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2021 12:44:44 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 25, 2021 12:44:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
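
The plans above show the projection (usedFields) and the filter being folded into the BigQuery source. A minimal sketch of issuing the same query through Beam SQL's public entry point, assuming HACKER_NEWS has already been registered with the SQL environment via a table provider (the test itself goes through BeamSqlRelUtils.toPCollection instead):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // `by` is backtick-quoted because it is a SQL keyword.
    PCollection<Row> filtered =
        pipeline.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM HACKER_NEWS "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
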
    Aug 25, 2021 12:44:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 25, 2021 12:44:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 25, 2021 12:44:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115850 bytes, hash fbf45e55e6dc68eb0829dc39b5bdfe8b68526efd01dbd2259989a0665969e61e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--_ReVebcaOsIKdw5tb3-i2hSbv0B29IlmYmgZllp5h4.pb
    Aug 25, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 25, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7162026089083881833.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PPy62EclH7QSABVYYZDAD2ptyZj3RtFELmSpIS5YW-s.jar
    Aug 25, 2021 12:44:52 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 25, 2021 12:44:53 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 25, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 25, 2021 12:44:53 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

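The SEVERE message above is gRPC's orphaned-channel detector: a BigQuery Storage write client was created during pipeline validation and its channel was garbage-collected while still open. The shutdown discipline the message asks for, sketched with the standard io.grpc API (the channel parameter stands in for whatever client owns it):

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    static void closeChannel(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();                               // begin orderly shutdown
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
        channel.shutdownNow();                          // force-close stragglers
        channel.awaitTermination(5, TimeUnit.SECONDS);  // wait for forced close
      }
    }
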
    Aug 25, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 25, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 25, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 25, 2021 12:44:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 25, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-25_05_44_53-2731005621199917695?project=apache-beam-testing
    Aug 25, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-25_05_44_53-2731005621199917695
    Aug 25, 2021 12:44:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-25_05_44_53-2731005621199917695
    Aug 25, 2021 12:44:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-25T12:44:57.222Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
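
The warning above is benign for this job: the worker pool is pinned to a fixed size, so the max-workers setting is simply ignored. A minimal sketch of the option combination involved, with placeholder values:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    DataflowPipelineOptions options =
        PipelineOptionsFactory.create().as(DataflowPipelineOptions.class);
    options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
    options.setNumWorkers(5);     // the fixed pool size actually used
    options.setMaxNumWorkers(5);  // ignored once autoscaling is NONE
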
    Aug 25, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:03.973Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 25, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:04.712Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 25, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:04.754Z: Expanding GroupByKey operations into optimizable parts.
    Aug 25, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:04.795Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 25, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:04.861Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 25, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:04.897Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 25, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:04.924Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 25, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:04.952Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 25, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:05.290Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:05.384Z: Starting 5 workers in us-central1-b...
    Aug 25, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:11.546Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 25, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:45:51.249Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 25, 2021 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:46:17.235Z: Workers have started successfully.
    Aug 25, 2021 12:46:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:46:17.263Z: Workers have started successfully.
    Aug 25, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:46:47.965Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:46:48.099Z: Cleaning up.
    Aug 25, 2021 12:46:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:46:48.181Z: Stopping worker pool...
    Aug 25, 2021 12:49:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:49:13.891Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 25, 2021 12:49:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T12:49:13.939Z: Worker pool stopped.
    Aug 25, 2021 12:49:21 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-25_05_44_53-2731005621199917695 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): bf0a21d6-ceee-4d21-9702-0d9019eb62c6 and timestamp: 2021-08-25T12:49:21.453000000Z:
                     Metric:                    Value:
                   read_time                     9.195
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2021 12:49:21 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 24 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.001 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 46.737 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 3s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/tqufzhso7xjkg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2343

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2343/display/redirect?page=changes>

Changes:

[Robert Burke] [BEAM-9615] Report Row as Known URN


------------------------------------------
[...truncated 346.60 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2021 6:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2021 6:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2021 6:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2021 6:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2021 6:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2021 6:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2021 6:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 25, 2021 6:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 25, 2021 6:44:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 25, 2021 6:44:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 25, 2021 6:44:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115851 bytes, hash 209a58f464c0cadf84dd8a5d3264582aed5e31bc6b6cded52e5348c650dd5e2f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-IJpY9GTAyt-E3YpdMmRYKu1eMbxrbN7VLlNIxlDdXi8.pb
    Aug 25, 2021 6:44:51 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 25, 2021 6:44:51 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 25, 2021 6:44:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7823649943976909898.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Hji4KVpcCe3-0B4V-QrjFk-TFYTWGf-GuAXZGwd3tx4.jar
    Aug 25, 2021 6:44:51 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 25, 2021 6:44:51 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 25, 2021 6:44:51 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 25, 2021 6:44:52 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 25, 2021 6:44:52 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 25, 2021 6:44:52 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 25, 2021 6:44:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 25, 2021 6:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-24_23_44_52-5596656985805478731?project=apache-beam-testing
    Aug 25, 2021 6:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-24_23_44_52-5596656985805478731
    Aug 25, 2021 6:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-24_23_44_52-5596656985805478731
    Aug 25, 2021 6:44:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-25T06:44:55.681Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 25, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:01.695Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 25, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:02.471Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 25, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:02.511Z: Expanding GroupByKey operations into optimizable parts.
    Aug 25, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:02.542Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 25, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:02.607Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 25, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:02.633Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 25, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:02.655Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 25, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:02.677Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 25, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:02.976Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:03.052Z: Starting 5 workers in us-central1-b...
    Aug 25, 2021 6:45:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:35.672Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 25, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:45:43.866Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 25, 2021 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:46:10.658Z: Workers have started successfully.
    Aug 25, 2021 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:46:10.685Z: Workers have started successfully.
    Aug 25, 2021 6:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:46:40.663Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2021 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:46:40.800Z: Cleaning up.
    Aug 25, 2021 6:46:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:46:40.870Z: Stopping worker pool...
    Aug 25, 2021 6:49:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:49:07.483Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 25, 2021 6:49:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T06:49:07.511Z: Worker pool stopped.
    Aug 25, 2021 6:49:14 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-24_23_44_52-5596656985805478731 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 49d18115-97e7-44ab-93de-ca33663e4373 and timestamp: 2021-08-25T06:49:14.194000000Z:
                     Metric:                    Value:
                   read_time                     9.381
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2021 6:49:14 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 21 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.003 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 38.639 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 55s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/7wn4houvwhs7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2342

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2342/display/redirect?page=changes>

Changes:

[ihor.indyk] Decreasing peak memory usage for

[ihor.indyk] Changing the default `merge_accumulators_batch_size`

[noreply] Clarify PCollection immutability. (#15227)

[noreply] [BEAM-11218] ptest allows to obtain a pipeline result (#15364)

[Kyle Weaver] Remove trailing whitespace

[noreply] Merge pull request #15357 from [BEAM-12781] Add memoization of


------------------------------------------
[...truncated 346.26 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 25, 2021 12:44:45 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 25, 2021 12:44:46 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 25, 2021 12:44:46 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 25, 2021 12:44:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 25, 2021 12:44:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 3b6590e294df1357abaf446c9db675861d94f8e656f85814eddf5f929610b88b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-O2WQ4pTfE1err0RsnbZ1hh2U-OZW-FgU7d9fkpYQuIs.pb
    Aug 25, 2021 12:44:51 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 25, 2021 12:44:51 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 25, 2021 12:44:51 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5339246499804845474.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dy7inAS7n6MZSVVTU3MdYcwdFszj45RKcfb69a44ju8.jar
    Aug 25, 2021 12:44:52 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 25, 2021 12:44:52 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 25, 2021 12:44:52 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
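
The SEVERE message above is gRPC's orphaned-channel detector, and the allocation site shows the channel was opened by a BigQueryWriteClient that BigQueryIO$TypedRead.validate creates internally, so this is an SDK-side leak rather than something the test controls (the job still submits and runs). For user code, the discipline the warning asks for looks roughly like this sketch; BigQueryWriteClient is AutoCloseable, so try-with-resources releases its channels:

    import com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient;

    public class WriteClientShutdownSketch {
      public static void main(String[] args) throws Exception {
        // Closing the client shuts down its underlying gRPC channels,
        // which is exactly what the orphan detector checks for.
        try (BigQueryWriteClient client = BigQueryWriteClient.create()) {
          // ... use the client ...
        }
        // For a raw io.grpc.ManagedChannel the equivalent is
        // channel.shutdown() followed by channel.awaitTermination(...).
      }
    }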

    Aug 25, 2021 12:44:52 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 25, 2021 12:44:52 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 25, 2021 12:44:52 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 25, 2021 12:44:52 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 25, 2021 12:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-24_17_44_52-8910469216472133896?project=apache-beam-testing
    Aug 25, 2021 12:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-24_17_44_52-8910469216472133896
    Aug 25, 2021 12:44:53 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-24_17_44_52-8910469216472133896
    Aug 25, 2021 12:44:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-25T00:44:56.091Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
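
This WARNING is expected for the perf test: it pins a fixed worker pool by disabling autoscaling, at which point Dataflow ignores maxNumWorkers. A sketch of the equivalent configuration using the standard DataflowPipelineOptions setters; the value 5 mirrors the log, the rest is illustrative:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // With NONE, numWorkers is the fixed pool size and any maxNumWorkers
        // setting is ignored, producing the warning seen above.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);
      }
    }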
    Aug 25, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:04.687Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 25, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:05.558Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 25, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:05.623Z: Expanding GroupByKey operations into optimizable parts.
    Aug 25, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:05.660Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 25, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:05.754Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 25, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:05.786Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 25, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:05.820Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 25, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:05.842Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 25, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:06.186Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:06.254Z: Starting 5 workers in us-central1-b...
    Aug 25, 2021 12:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:22.577Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 25, 2021 12:45:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:45:51.653Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 25, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:46:20.747Z: Workers have started successfully.
    Aug 25, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:46:20.774Z: Workers have started successfully.
    Aug 25, 2021 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:46:51.160Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 25, 2021 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:46:51.295Z: Cleaning up.
    Aug 25, 2021 12:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:46:51.410Z: Stopping worker pool...
    Aug 25, 2021 12:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:49:15.214Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 25, 2021 12:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-25T00:49:15.255Z: Worker pool stopped.
    Aug 25, 2021 12:49:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-24_17_44_52-8910469216472133896 finished with status DONE.


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d047f196-0d93-4ae2-83f3-ffdcdbeea219 and timestamp: 2021-08-25T00:49:20.538000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.093

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 25, 2021 12:49:20 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
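
The metrics were computed (see the table above) but skipped at publish time because the run did not configure an InfluxDB measurement/database. A sketch of what wiring them up might look like, assuming the builder-style InfluxDBSettings API from the same org.apache.beam.sdk.testutils.publishing package; the builder method names are from memory and the host, database, and measurement values are hypothetical:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class MetricsSettingsSketch {
      public static void main(String[] args) {
        // Hypothetical values; with these supplied, InfluxDBPublisher would
        // have somewhere to write read_time and fields_read.
        InfluxDBSettings settings =
            InfluxDBSettings.builder()
                .withHost("http://localhost:8086")
                .withDatabase("beam_test_metrics")
                .withMeasurement("sql_bqio_read_java_batch")
                .get();
      }
    }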

Gradle Test Executor 906 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.005 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.004 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 44.881 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/d2hjkw5f3vvq6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2341

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2341/display/redirect?page=changes>

Changes:

[noreply] [Go SDK] Go SDK Exits Experimental (#15374)

[noreply] [BEAM-12724][BEAM-12349] Support user timers in Samza portable runner


------------------------------------------
[...truncated 347.53 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
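
The exception message itself lists the fix: a PCollection of Beam Rows cannot get a default coder inferred, so the producing transform has to attach a schema (or an explicit coder). A minimal sketch of the setRowSchema route; the field names and types are assumptions matching the query's four output columns:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      // Attaching a schema gives the output a row coder, which is the
      // remediation the IllegalStateException suggests.
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();
        // Alternatively: rows.setCoder(RowCoder.of(schema));
        return rows.setRowSchema(schema);
      }
    }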

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 24, 2021 6:45:03 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 24, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 24, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 24, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115849 bytes, hash 5d357a21aca46d5e6ad7850c329848b0aa3704d5e1c89fce6b01383d60c3cb7d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-XTV6IaykbV5q14UMMphIsKo3BNXhyJ_OawE4PWDDy30.pb
    Aug 24, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 24, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 24, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8622247746424991869.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ZEEVUNj0HjFWkYeExuHOiNYhB6ziVhVkTwZIUhN_bBw.jar
    Aug 24, 2021 6:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 24, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 24, 2021 6:45:10 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 24, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 24, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 24, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 24, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 24, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-24_11_45_11-947508423808894357?project=apache-beam-testing
    Aug 24, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-24_11_45_11-947508423808894357
    Aug 24, 2021 6:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-24_11_45_11-947508423808894357
    Aug 24, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-24T18:45:14.361Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 24, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:21.827Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 24, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:22.433Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 24, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:22.470Z: Expanding GroupByKey operations into optimizable parts.
    Aug 24, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:22.510Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 24, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:22.580Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 24, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:22.604Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 24, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:22.637Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 24, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:22.672Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 24, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:23.048Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:23.155Z: Starting 5 workers in us-central1-c...
    Aug 24, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:45:37.619Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 24, 2021 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:46:07.868Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 24, 2021 6:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:46:36.400Z: Workers have started successfully.
    Aug 24, 2021 6:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:46:36.433Z: Workers have started successfully.
    Aug 24, 2021 6:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:47:11.900Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2021 6:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:47:12.050Z: Cleaning up.
    Aug 24, 2021 6:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:47:12.115Z: Stopping worker pool...
    Aug 24, 2021 6:49:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:49:37.248Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 24, 2021 6:49:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T18:49:37.297Z: Worker pool stopped.
    Aug 24, 2021 6:49:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-24_11_45_11-947508423808894357 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 698ce4cd-d819-4375-ad61-300d34410f09 and timestamp: 2021-08-24T18:49:46.137000000Z:
                     Metric:                    Value:
                   read_time                    10.472
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2021 6:49:46 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 54.388 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 28s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/b7r6nworvtafg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2340

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2340/display/redirect>

Changes:


------------------------------------------
[...truncated 348.49 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 24, 2021 12:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 24, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 24, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 24, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 77ab5a75f897fd51b24a77238d2eab236ed527b90d930f390bd9238f1c5cf62c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-d6tadfiX_VGySncjjS6rI27VJ7kNkw85C9kjjxxc9iw.pb
    Aug 24, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 24, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7562348084643247515.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-7w5ZcGIZ73ps57vzlcwBcfXftGbIPQsErdTdkcJedSU.jar
    Aug 24, 2021 12:45:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 24, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 1 seconds
    Aug 24, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 24, 2021 12:45:11 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 24, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 24, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 24, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 24, 2021 12:45:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 24, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-24_05_45_11-8370176166124880556?project=apache-beam-testing
    Aug 24, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-24_05_45_11-8370176166124880556
    Aug 24, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-24_05_45_11-8370176166124880556
    Aug 24, 2021 12:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-24T12:45:15.257Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 24, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:22.101Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 24, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:22.766Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 24, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:22.805Z: Expanding GroupByKey operations into optimizable parts.
    Aug 24, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:22.832Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 24, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:22.907Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 24, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:22.942Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 24, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:22.994Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 24, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:23.025Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 24, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:23.342Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:23.420Z: Starting 5 workers in us-central1-a...
    Aug 24, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:45:46.057Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 24, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:46:14.041Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 24, 2021 12:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:46:40.404Z: Workers have started successfully.
    Aug 24, 2021 12:46:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:46:40.433Z: Workers have started successfully.
    Aug 24, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:47:10.840Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:47:10.981Z: Cleaning up.
    Aug 24, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:47:11.061Z: Stopping worker pool...
    Aug 24, 2021 12:49:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:49:33.197Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 24, 2021 12:49:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T12:49:33.244Z: Worker pool stopped.
    Aug 24, 2021 12:49:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-24_05_45_11-8370176166124880556 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 00b9ddb7-131d-43a2-80d9-76d61c867697 and timestamp: 2021-08-24T12:49:40.512000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.354

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2021 12:49:40 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
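
This warning means the metrics were computed but never published: the InfluxDB measurement and database settings were absent from this run. A minimal, hypothetical sketch of the guard this reflects -- the property names below are illustrative only, not necessarily the ones this suite actually reads:

    public class MetricsPublishGuard {
      public static void main(String[] args) {
        // Hypothetical property names, for illustration only.
        String database = System.getProperty("influxDatabase");
        String measurement = System.getProperty("influxMeasurement");
        if (database == null || measurement == null) {
          // Mirrors the logged behavior: warn and skip, don't fail the build.
          System.err.println("Missing property -- measurement/database. Metrics won't be published.");
          return;
        }
        System.out.printf("Would publish metrics to database=%s, measurement=%s%n", database, measurement);
      }
    }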

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 48.743 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 21s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/xyxt63uzoc5i4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2339

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2339/display/redirect>

Changes:


------------------------------------------
[...truncated 347.25 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
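
All three root causes in this failure point at the same remedy: a PCollection of Beam Rows needs its schema declared before coder inference runs. Below is a minimal sketch of the PCollection.setRowSchema fix the message recommends -- not the IT's actual code, and the schema is a hypothetical one shaped like the query's projected columns:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.DoFn.ProcessElement;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();
        // Hypothetical schema matching the projected columns (author, type, title, score).
        final Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();
        PCollection<Row> rows = pipeline
            .apply(Create.of("seed"))
            .apply(ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void emit(ProcessContext c) {
                c.output(Row.withSchema(schema)
                    .addValues("alice", "story", "hello", 3L).build());
              }
            }))
            // Without this call, Row coder inference fails exactly as logged above.
            .setRowSchema(schema);
        pipeline.run().waitUntilFinish();
      }
    }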

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 24, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
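
The two plans above show the optimization working: the LogicalProject and LogicalFilter collapse into BeamPushDownIOSourceRel, so BigQuery's Storage Read API evaluates the filter and only the four usedFields are read. For reference, running the same query shape through Beam SQL's public entry point looks roughly like the sketch below; it swaps the BigQuery table for an in-memory PCOLLECTION, so no push-down occurs here -- it only illustrates the SqlTransform API, not this test's setup:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQueryShape {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();
        // Hypothetical schema mirroring the four columns the query touches.
        Schema schema = Schema.builder()
            .addNullableField("by", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();
        PCollection<Row> input = pipeline.apply(
            Create.of(Row.withSchema(schema).addValues("alice", "story", "hello", 3L).build())
                .withRowSchema(schema));
        // A single schema'd input is addressable as PCOLLECTION in Beam SQL.
        PCollection<Row> result = input.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        // 'result' carries rows with columns author, type, title, score.
        pipeline.run().waitUntilFinish();
      }
    }
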
    Aug 24, 2021 6:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 24, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 24, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115849 bytes, hash e573b968a120762c5e0ded2fa1c39683afbd788979270b362cd3233063bbdb5d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-5XO5aKEgdixeDe0vocOWg6-9eIl5Jws2LNMjMGO7210.pb
    Aug 24, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 24, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 24, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test90574163896039833.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Nd6Tp-3h3mmL5zQrGJ3oJe9LQ0PNPXn6a3mGVhVVT0o.jar
    Aug 24, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 24, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 24, 2021 6:45:06 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
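
This SEVERE entry is gRPC's orphaned-channel detector: during pipeline validation a BigQuery write client created a ManagedChannel for bigquerystorage.googleapis.com and let it be garbage-collected without ever closing it. It is a resource-hygiene warning, not the cause of the test failures. The shutdown sequence the message asks for, as a minimal generic gRPC sketch (assumed usage, not Beam's internal code):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownExample {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel = ManagedChannelBuilder
            .forTarget("bigquerystorage.googleapis.com:443")
            .useTransportSecurity()
            .build();
        try {
          // ... issue RPCs over the channel here ...
        } finally {
          channel.shutdown();                      // begin orderly shutdown
          if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
            channel.shutdownNow();                 // force-close stragglers
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }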

    Aug 24, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 24, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 24, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 24, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 24, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-23_23_45_06-12024042760390752771?project=apache-beam-testing
    Aug 24, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-23_23_45_06-12024042760390752771
    Aug 24, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-23_23_45_06-12024042760390752771
    Aug 24, 2021 6:45:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-24T06:45:10.294Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 24, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:18.352Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 24, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:19.135Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 24, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:19.182Z: Expanding GroupByKey operations into optimizable parts.
    Aug 24, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:19.217Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 24, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:19.292Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 24, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:19.331Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 24, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:19.354Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 24, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:19.385Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 24, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:19.761Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2021 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:19.838Z: Starting 5 workers in us-central1-a...
    Aug 24, 2021 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:45:41.680Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 24, 2021 6:46:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:46:08.731Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 24, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:46:34.142Z: Workers have started successfully.
    Aug 24, 2021 6:46:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:46:34.178Z: Workers have started successfully.
    Aug 24, 2021 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:47:03.162Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2021 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:47:03.322Z: Cleaning up.
    Aug 24, 2021 6:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:47:03.435Z: Stopping worker pool...
    Aug 24, 2021 6:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:49:24.480Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 24, 2021 6:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T06:49:24.530Z: Worker pool stopped.
    Aug 24, 2021 6:49:29 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-23_23_45_06-12024042760390752771 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f9a56bea-86e3-46a9-a6dd-56315b2c8f9e and timestamp: 2021-08-24T06:49:29.976000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.055

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2021 6:49:30 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 40.294 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/e3qu7cwwshy3i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2338

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2338/display/redirect?page=changes>

Changes:

[Andrew Pilloud] [BEAM-9379] Remove Java11 specific SQL timeout

[noreply] [BEAM-6374] Elide collecting unnecessary pcollection metrics (#15358)


------------------------------------------
[...truncated 348.39 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 24, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 24, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 24, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 24, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 24, 2021 12:45:02 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 24, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 24, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 24, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash f5d290594afad0e5d185c403384bdaecfebbecb2f037409effe0f47402f089e3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9dKQWUr60OXRhcQDOEva7P677LLwN0Ce_-D0dALwieM.pb
    Aug 24, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 24, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 24, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2074455636289994950.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-_gBpWKci_N4q3g0ZRj7VE_dF0R8GyhcN5TBCHx7urw4.jar
    Aug 24, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 24, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 24, 2021 12:45:10 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 24, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 24, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 24, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 24, 2021 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 24, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-23_17_45_10-4110959396966204691?project=apache-beam-testing
    Aug 24, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-23_17_45_10-4110959396966204691
    Aug 24, 2021 12:45:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-23_17_45_10-4110959396966204691
    Aug 24, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-24T00:45:14.076Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 24, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:20.669Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 24, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:21.290Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 24, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:21.332Z: Expanding GroupByKey operations into optimizable parts.
    Aug 24, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:21.368Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 24, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:21.440Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 24, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:21.490Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 24, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:21.527Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 24, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:21.551Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 24, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:21.883Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:21.959Z: Starting 5 workers in us-central1-a...
    Aug 24, 2021 12:45:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:45:33.988Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 24, 2021 12:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:46:05.713Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 24, 2021 12:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:46:38.436Z: Workers have started successfully.
    Aug 24, 2021 12:46:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:46:38.488Z: Workers have started successfully.
    Aug 24, 2021 12:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:47:04.609Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 24, 2021 12:47:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:47:04.790Z: Cleaning up.
    Aug 24, 2021 12:47:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:47:04.879Z: Stopping worker pool...
    Aug 24, 2021 12:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:49:25.245Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 24, 2021 12:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-24T00:49:25.298Z: Worker pool stopped.
    Aug 24, 2021 12:49:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-23_17_45_10-4110959396966204691 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c0f81c88-4175-4c98-ad11-3bc387b804de and timestamp: 2021-08-24T00:49:32.144000000Z:
                     Metric:                    Value:
                   read_time                     7.223
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 24, 2021 12:49:32 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 40.79 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 13s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/ykni7griszpxu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2337

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2337/display/redirect?page=changes>

Changes:

[noreply] [BEAM-8571] Release Go SDK containers, and use them. (#15365)


------------------------------------------
[...truncated 346.85 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 23, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
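
Note on the plan above: the BeamPushDownIOSourceRel means both the field projection and the WHERE clause are handed to the BigQuery Storage Read API rather than evaluated on the workers. A rough hand-written equivalent at the BigQueryIO level, assuming the DIRECT_READ method (the table reference below is illustrative, not the test's actual dataset):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    // Read via the Storage API so the projection and filter run inside BigQuery.
    PCollection<TableRow> rows =
        pipeline.apply(
            BigQueryIO.readTableRows()
                .from("apache-beam-testing:beam.HACKER_NEWS")  // illustrative table reference
                .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
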
    Aug 23, 2021 6:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 23, 2021 6:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 23, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 87e1d49a1d9327e453dff5fc21f8ef430f8d9b145f8932ccb5552dadc5d6130c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-h-HUmh2TJ-RT3_X8IfjvQw-NmxRfiTLMtVUtrcXWEww.pb
    Aug 23, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 23, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 23, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8832978331143602159.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Hy5IStGjMkcxpZlGHjcepkzJ8LsjiD825Kjyz5Cea2A.jar
    Aug 23, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 1 seconds
    Aug 23, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 23, 2021 6:45:06 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
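
The SEVERE message above is gRPC's orphaned-channel detector: a ManagedChannel (here created inside BigQueryWriteClient during pipeline validation) was garbage-collected before being shut down. The remedy is the one the message states; a minimal sketch, assuming direct ownership of the channel (in this test the channel belongs to the generated client, so closing the client, e.g. in a try-with-resources block, is the practical equivalent):

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;

    static void shutdownChannel(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();                                    // begin orderly shutdown
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {  // wait for in-flight RPCs
        channel.shutdownNow();                               // force-cancel stragglers
        channel.awaitTermination(5, TimeUnit.SECONDS);
      }
    }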

    Aug 23, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 23, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 23, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 23, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 23, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-23_11_45_07-8390002786789764599?project=apache-beam-testing
    Aug 23, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-23_11_45_07-8390002786789764599
    Aug 23, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-23_11_45_07-8390002786789764599
    Aug 23, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-23T18:45:10.709Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 23, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:17.702Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 23, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:18.349Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 23, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:18.387Z: Expanding GroupByKey operations into optimizable parts.
    Aug 23, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:18.418Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 23, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:18.488Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 23, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:18.519Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 23, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:18.553Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 23, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:18.591Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 23, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:19.002Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:19.075Z: Starting 5 workers in us-central1-a...
    Aug 23, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:45:22.279Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 23, 2021 6:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:46:08.308Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 23, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:46:34.209Z: Workers have started successfully.
    Aug 23, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:46:34.240Z: Workers have started successfully.
    Aug 23, 2021 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:47:15.006Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2021 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:47:15.153Z: Cleaning up.
    Aug 23, 2021 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:47:15.224Z: Stopping worker pool...
    Aug 23, 2021 6:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:49:39.412Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 23, 2021 6:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T18:49:39.471Z: Worker pool stopped.
    Aug 23, 2021 6:49:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-23_11_45_07-8390002786789764599 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 5a09ddc6-4b88-49d0-bf1c-3a089c137d2b and timestamp: 2021-08-23T18:49:45.295000000Z:
                     Metric:                    Value:
                   read_time                    12.744
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2021 6:49:45 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 56.314 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 28s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4wmbsb5kgl43c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2336

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2336/display/redirect>

Changes:


------------------------------------------
[...truncated 347.48 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 23, 2021 12:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 23, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 23, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 23, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash 529d3a567ef9a5adcce5e48f1a300fc4aa37a6ff2fb8b2b8afcf1f31d35fef1b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Up06Vn75pa3M5eSPGjAPxKo3pv8vuLK4r88fMdNf7xs.pb
    Aug 23, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 23, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 23, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test534913085253234040.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-P0IhQXa6XKFWc6qM69PKkE3L-VjawIWQbWgT-mW_qWA.jar
    Aug 23, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 23, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 23, 2021 12:45:06 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 23, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 23, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 23, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 23, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 23, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-23_05_45_07-880701081066478945?project=apache-beam-testing
    Aug 23, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-23_05_45_07-880701081066478945
    Aug 23, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-23_05_45_07-880701081066478945
    Aug 23, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-23T12:45:10.949Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 23, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:16.384Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 23, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:17.019Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 23, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:17.046Z: Expanding GroupByKey operations into optimizable parts.
    Aug 23, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:17.075Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 23, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:17.152Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 23, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:17.179Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 23, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:17.213Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 23, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:17.247Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 23, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:17.576Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:17.653Z: Starting 5 workers in us-central1-c...
    Aug 23, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:45:28.007Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 23, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:46:01.636Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 23, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:46:28.750Z: Workers have started successfully.
    Aug 23, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:46:28.777Z: Workers have started successfully.
    Aug 23, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:47:02.021Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:47:02.183Z: Cleaning up.
    Aug 23, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:47:02.251Z: Stopping worker pool...
    Aug 23, 2021 12:49:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:49:20.927Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 23, 2021 12:49:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T12:49:20.970Z: Worker pool stopped.
    Aug 23, 2021 12:49:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-23_05_45_07-880701081066478945 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d9570b90-f0fd-422d-9c0a-2948986e8649 and timestamp: 2021-08-23T12:49:27.522000000Z:
                     Metric:                    Value:
                   read_time                    11.019
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2021 12:49:27 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 37.145 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/n4smsro3vmzv6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2335

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2335/display/redirect>

Changes:


------------------------------------------
[...truncated 347.17 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 23, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 23, 2021 6:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 23, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 23, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115850 bytes, hash d89537da9169570854dff52e38f93a7d27f48ea5249671501433d7de30be3a52> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-2JU32pFpVwhU3_UuOPk6fSf0jqUklnFQFDPX3jC-OlI.pb
    Aug 23, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 23, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 23, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4594992340031266113.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lBV-JKPWhghoRGcerUd7D37beWHdjk9gbjgk_41S4vY.jar
    Aug 23, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 23, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 23, 2021 6:45:05 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 23, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 23, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 23, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 23, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 23, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-22_23_45_05-4945051276705116756?project=apache-beam-testing
    Aug 23, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-22_23_45_05-4945051276705116756
    Aug 23, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-22_23_45_05-4945051276705116756
    Aug 23, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-23T06:45:09.031Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 23, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:16.334Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 23, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:17.078Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 23, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:17.119Z: Expanding GroupByKey operations into optimizable parts.
    Aug 23, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:17.156Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 23, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:17.226Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 23, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:17.250Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 23, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:17.284Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 23, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:17.316Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 23, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:17.638Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:17.706Z: Starting 5 workers in us-central1-a...
    Aug 23, 2021 6:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:33.804Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
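
The Monitoring API calls linked in the message above can be used to prune stale descriptors once the 100-descriptor limit is hit. A minimal sketch with the google-cloud-monitoring Java client; the project ID is taken from this job's logs, and the descriptor name is hypothetical:

    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.MetricDescriptorName;

    public class DeleteMetricDescriptorSketch {
      public static void main(String[] args) throws Exception {
        // The client picks up Application Default Credentials.
        try (MetricServiceClient client = MetricServiceClient.create()) {
          // Hypothetical descriptor; list descriptors first and delete only
          // the custom.googleapis.com/* entries that are no longer used.
          client.deleteMetricDescriptor(
              MetricDescriptorName.of(
                  "apache-beam-testing",
                  "custom.googleapis.com/some_unused_counter"));
        }
      }
    }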
    Aug 23, 2021 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:45:59.754Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 23, 2021 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:46:24.782Z: Workers have started successfully.
    Aug 23, 2021 6:46:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:46:24.801Z: Workers have started successfully.
    Aug 23, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:46:55.912Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2021 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:46:56.062Z: Cleaning up.
    Aug 23, 2021 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:46:56.131Z: Stopping worker pool...
    Aug 23, 2021 6:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:49:15.473Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 23, 2021 6:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T06:49:15.520Z: Worker pool stopped.
    Aug 23, 2021 6:49:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-22_23_45_05-4945051276705116756 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1f8851f7-cd17-4e94-b34c-e28cd6d4186f and timestamp: 2021-08-23T06:49:20.719000000Z:
                     Metric:                    Value:
                   read_time                    11.386
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2021 6:49:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 31.591 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 1s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/75asxot5ukjak

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2334

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2334/display/redirect>

Changes:


------------------------------------------
[...truncated 349.71 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 23, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 23, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 23, 2021 12:45:06 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 23, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 23, 2021 12:45:07 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
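
The two plans above show what the push-down buys: usedFields trims the projection to four columns, and the supported filter is evaluated inside the BigQuery Storage Read API instead of in a Beam ParDo. A hand-written read with roughly the same effect, sketched with BigQueryIO's public DIRECT_READ knobs (the table reference is hypothetical):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.HACKER_NEWS")  // hypothetical
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Projection push-down: only these columns leave BigQuery.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: rows are filtered server-side.
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }
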
    Aug 23, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 23, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 23, 2021 12:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash d304db1bff81137b2a3123b402520475b4cb00c96dcdbcaa14d8fb1852e1802b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0wTbG_-BE3sqMSO0AlIEdbTLAMltzbyqFNj7GFLhgCs.pb
    Aug 23, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 23, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6981420953870452952.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lWa4GK1uz1JVFs4DlnWTHEJc7EF2jKU4N_ZhxmVerGU.jar
    Aug 23, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 23, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 23, 2021 12:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 23, 2021 12:45:15 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
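
The SEVERE entry above is gRPC's orphan detector: a ManagedChannel opened during pipeline validation (by BigQueryServicesImpl) was garbage-collected before being shut down. The pattern the warning prescribes, shown in a hypothetical standalone client against the same endpoint:

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel = ManagedChannelBuilder
            .forTarget("bigquerystorage.googleapis.com:443")
            .useTransportSecurity()
            .build();
        try {
          // ... issue RPCs through the channel here ...
        } finally {
          channel.shutdown();                    // orderly shutdown first
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();               // then force-cancel stragglers
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }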

    Aug 23, 2021 12:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 23, 2021 12:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 23, 2021 12:45:15 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 23, 2021 12:45:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 23, 2021 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-22_17_45_15-5832868591656669552?project=apache-beam-testing
    Aug 23, 2021 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-22_17_45_15-5832868591656669552
    Aug 23, 2021 12:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-22_17_45_15-5832868591656669552
    Aug 23, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-23T00:45:19.199Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 23, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:26.159Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 23, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:26.891Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 23, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:26.932Z: Expanding GroupByKey operations into optimizable parts.
    Aug 23, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:26.964Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 23, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:27.038Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 23, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:27.073Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 23, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:27.104Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 23, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:27.127Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 23, 2021 12:45:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:27.574Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2021 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:27.652Z: Starting 5 workers in us-central1-a...
    Aug 23, 2021 12:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:45:41.513Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 23, 2021 12:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:46:01.947Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Aug 23, 2021 12:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:46:01.980Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Aug 23, 2021 12:46:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:46:12.303Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 23, 2021 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:46:35.986Z: Workers have started successfully.
    Aug 23, 2021 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:46:36.040Z: Workers have started successfully.
    Aug 23, 2021 12:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:47:03.119Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 23, 2021 12:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:47:03.259Z: Cleaning up.
    Aug 23, 2021 12:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:47:03.344Z: Stopping worker pool...
    Aug 23, 2021 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:49:33.882Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 23, 2021 12:49:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-23T00:49:33.927Z: Worker pool stopped.
    Aug 23, 2021 12:49:40 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-22_17_45_15-5832868591656669552 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7f0e9431-0e4b-49b9-9471-b795495a8960 and timestamp: 2021-08-23T00:49:40.559000000Z:
                     Metric:                    Value:
                   read_time                     8.693
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 23, 2021 12:49:41 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 9 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.005 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.006 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 44.389 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 23s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/etm722jcxvvnu

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2333

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2333/display/redirect>

Changes:


------------------------------------------
[...truncated 346.87 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
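
For the readUsingDefaultMethod failure above, the exception text itself names the fix: give the Row-typed ParDo output an explicit schema (or coder) before pipeline validation runs. A minimal self-contained sketch, with a hypothetical schema modeled on the query's projected columns:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        final Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();
        PCollection<Row> rows = p
            .apply(Create.of("seed"))
            .apply(ParDo.of(new DoFn<String, Row>() {
              @ProcessElement
              public void process(ProcessContext c) {
                c.output(Row.withSchema(schema)
                    .addValues("alice", "story", "hello", 3L).build());
              }
            }));
        // Without this (or rows.setCoder(RowCoder.of(schema))), coder
        // inference fails with the IllegalStateException shown above.
        rows.setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }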

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 22, 2021 6:45:02 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 22, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 22, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 22, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115910 bytes, hash 84c471f5d46b0db1a1c001efe2c90e10477d9b0e70efaeb7f9eefc9ee4770dcb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hMRx9dRrDbGhwAHv4skOEEd9mw5w7663-e78nuR3Dcs.pb
    Aug 22, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 22, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 22, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test840912480544654269.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Zd9ESk1nfDJxgSfR6X-OFiOY7CSCb81aJNGgmaUD3nU.jar
    Aug 22, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 22, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 22, 2021 6:45:08 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 22, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 22, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 22, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 22, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 22, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-22_11_45_09-14032351991430978495?project=apache-beam-testing
    Aug 22, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-22_11_45_09-14032351991430978495
    Aug 22, 2021 6:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-22_11_45_09-14032351991430978495
    Aug 22, 2021 6:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-22T18:45:12.831Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 22, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:19.586Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 22, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:20.426Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 22, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:20.459Z: Expanding GroupByKey operations into optimizable parts.
    Aug 22, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:20.477Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 22, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:20.559Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 22, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:20.592Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 22, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:20.624Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 22, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:20.647Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 22, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:20.968Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2021 6:45:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:21.036Z: Starting 5 workers in us-central1-a...
    Aug 22, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:45:32.415Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 22, 2021 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:46:05.965Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 22, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:46:32.076Z: Workers have started successfully.
    Aug 22, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:46:32.104Z: Workers have started successfully.
    Aug 22, 2021 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:47:01.752Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2021 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:47:01.900Z: Cleaning up.
    Aug 22, 2021 6:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:47:01.982Z: Stopping worker pool...
    Aug 22, 2021 6:49:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:49:20.552Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 22, 2021 6:49:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T18:49:20.594Z: Worker pool stopped.
    Aug 22, 2021 6:49:25 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-22_11_45_09-14032351991430978495 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3a571ed7-d338-4094-9ebb-43aac972ea62 and timestamp: 2021-08-22T18:49:25.775000000Z:
                     Metric:                    Value:
                   read_time                     8.698
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2021 6:49:26 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 33.716 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
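
For anyone chasing those deprecations, the same task can be re-run with the flag the message suggests, for example:

> ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --warning-mode all

The task path is the one that failed above; --warning-mode is a standard Gradle 6 flag.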

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4m35x47irhsya

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2332

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2332/display/redirect>

Changes:


------------------------------------------
[...truncated 363.02 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
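
This coder failure is the actual recurring breakage in readUsingDefaultMethod: the output of ParDo(RowMonitor) is a PCollection of Beam Row, and Row has no inferable coder, so the pipeline cannot be finalized until a schema is attached. The error text itself names the fix, PCollection.setRowSchema. A minimal self-contained sketch of that pattern follows; the field names mirror the test query, but the schema, values, and class name are illustrative assumptions, not the test's real table definition.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative schema; field names follow the SELECT above.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        PCollection<Row> rows =
            p.apply(
                Create.of(Row.withSchema(schema).addValues("a", "story", "t", 3L).build())
                    .withRowSchema(schema));

        // For a Row-producing ParDo such as RowMonitor, the equivalent fix is
        // calling setRowSchema(schema) on its output before it is consumed:
        // rows.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }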

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2021 5:05:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2021 5:05:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2021 5:05:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2021 5:05:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2021 5:05:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2021 5:05:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2021 5:05:51 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 22, 2021 5:05:51 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
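
The planner output above is the push-down mechanism at work: Calcite first emits the logical plan (LogicalProject over LogicalFilter over BeamIOSourceRel), and Beam then rewrites it into a BeamPushDownIOSourceRel that carries the projected columns (usedFields) and the supported filter into the BigQuery Storage read itself, leaving only a thin BeamCalcRel on top. Below is a rough sketch of driving the same path programmatically, using the BeamSqlEnv and BeamSqlRelUtils entry points visible in the stack traces; the DDL, column list, and table location are illustrative assumptions rather than the test's real configuration.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
    import org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownSketch {
      public static void main(String[] args) throws Exception {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        BeamSqlEnv env = BeamSqlEnv.inMemory(new BigQueryTableProvider());

        // Placeholder table: the real HACKER_NEWS table has more columns.
        env.executeDdl(
            "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, type VARCHAR, title VARCHAR, score BIGINT) "
                + "TYPE bigquery "
                + "LOCATION 'my-project:my_dataset.hacker_news' "
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'");

        // Planning this query yields the BeamPushDownIOSourceRel seen in the log.
        PCollection<Row> rows =
            BeamSqlRelUtils.toPCollection(
                p,
                env.parseQuery(
                    "SELECT `by` AS author, type, title, score FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
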
    Aug 22, 2021 5:05:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 22, 2021 5:05:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 22, 2021 5:05:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115849 bytes, hash 1d5dd9e452c8a9d38de4d2063c14964d9c2d53a8a31316d5fbb93960fd9c4756> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-HV3Z5FLIqdON5NIGPBSWTZwtU6ijExbV-7k5YP2cR1Y.pb
    Aug 22, 2021 5:05:58 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 22, 2021 5:05:58 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 22, 2021 5:05:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8882100940941637786.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-BpnF92f3r_2xVBwmO4pIan7GxsHHx5UsV1KvyaP1pNw.jar
    Aug 22, 2021 5:05:59 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 22, 2021 5:05:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 22, 2021 5:05:59 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
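
The SEVERE message comes from gRPC's orphaned-channel detector: a ManagedChannel created while BigQueryIO$TypedRead.validate built a BigQuery write stub was garbage-collected without an orderly shutdown. The allocation site shows the channel is opened inside BigQueryServicesImpl, so the leak is in library code rather than anything the test controls, but the shutdown pattern the warning asks for looks like the sketch below (the target string is taken from the log; the rest is illustrative).

    import java.util.concurrent.TimeUnit;

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          channel.shutdown();                                   // stop accepting new calls
          if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
            channel.shutdownNow();                              // force-close stragglers
          }
        }
      }
    }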

    Aug 22, 2021 5:05:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 22, 2021 5:05:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 22, 2021 5:05:59 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 22, 2021 5:05:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 22, 2021 5:06:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-22_10_05_59-13464271286398948887?project=apache-beam-testing
    Aug 22, 2021 5:06:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-22_10_05_59-13464271286398948887
    Aug 22, 2021 5:06:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-22_10_05_59-13464271286398948887
    Aug 22, 2021 5:06:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-22T17:06:03.073Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
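
This warning is expected for this job rather than a problem: with autoscalingAlgorithm=NONE the Dataflow worker pool is sized from the numWorkers option and maxNumWorkers is ignored, which is presumably why exactly 5 workers start a few lines later. Passing --numWorkers=5 instead of (or alongside) --maxNumWorkers=5 would avoid the message; the option names are standard Dataflow pipeline options, while the values here are inferred from this log.
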
    Aug 22, 2021 5:06:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:10.427Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 22, 2021 5:06:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:11.263Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 22, 2021 5:06:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:11.305Z: Expanding GroupByKey operations into optimizable parts.
    Aug 22, 2021 5:06:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:11.335Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 22, 2021 5:06:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:11.408Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 22, 2021 5:06:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:11.463Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 22, 2021 5:06:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:11.493Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 22, 2021 5:06:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:11.534Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 22, 2021 5:06:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:11.873Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2021 5:06:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:11.946Z: Starting 5 workers in us-central1-a...
    Aug 22, 2021 5:06:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:22.053Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
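
This notice concerns a Cloud Monitoring descriptor limit, not a test failure; the counters remain readable through dataflow.googleapis.com/job/user_counter. If the custom.googleapis.com/* descriptors matter, stale ones can be deleted with the Monitoring v3 API the message links to, along these lines (the project ID and metric type are placeholders, and the exact call should be checked against the linked API reference):

> curl -X DELETE -H "Authorization: Bearer $(gcloud auth print-access-token)" "https://monitoring.googleapis.com/v3/projects/PROJECT_ID/metricDescriptors/custom.googleapis.com%2Fold_unused_metric"
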
    Aug 22, 2021 5:06:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:06:58.176Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 22, 2021 5:07:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:07:24.014Z: Workers have started successfully.
    Aug 22, 2021 5:07:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:07:24.043Z: Workers have started successfully.
    Aug 22, 2021 5:07:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:07:53.134Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2021 5:07:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:07:53.274Z: Cleaning up.
    Aug 22, 2021 5:07:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:07:53.347Z: Stopping worker pool...
    Aug 22, 2021 5:10:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:10:18.666Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 22, 2021 5:10:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T17:10:18.711Z: Worker pool stopped.
    Aug 22, 2021 5:10:23 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-22_10_05_59-13464271286398948887 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6826683d-ec54-4947-b056-076498f3ca06 and timestamp: 2021-08-22T17:10:24Z:
                     Metric:                    Value:
                   read_time                     9.492
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2021 5:10:24 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 43.26 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 1s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/dftibhvlqgvvw

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2331

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2331/display/redirect>

Changes:


------------------------------------------
[...truncated 347.52 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 22, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 22, 2021 6:44:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 22, 2021 6:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 22, 2021 6:44:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash a1d384581e1557c3e1c87a2b8be513baec366b68cdffdb3b0a0abe294902e12b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-odOEWB4VV8PhyHori-UTuuw2a2jN_9s7Cgq-KUkC4Ss.pb
    Aug 22, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 22, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2652429298273514904.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ZIdNL1nZzeV9n8p1wxep22AIMQCMLLDjhu7MpRSbh_c.jar
    Aug 22, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 22, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 22, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 22, 2021 6:45:01 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 22, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 22, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 22, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 22, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 22, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-21_23_45_01-12378118859852012271?project=apache-beam-testing
    Aug 22, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-21_23_45_01-12378118859852012271
    Aug 22, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-21_23_45_01-12378118859852012271
    Aug 22, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-22T06:45:05.090Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 22, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:11.853Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 22, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:12.611Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 22, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:12.650Z: Expanding GroupByKey operations into optimizable parts.
    Aug 22, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:12.675Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 22, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:12.746Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 22, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:12.775Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 22, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:12.800Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 22, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:12.822Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 22, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:13.116Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:13.191Z: Starting 5 workers in us-central1-a...
    Aug 22, 2021 6:45:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:45:30.563Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 22, 2021 6:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:46:04.861Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 22, 2021 6:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:46:29.895Z: Workers have started successfully.
    Aug 22, 2021 6:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:46:29.954Z: Workers have started successfully.
    Aug 22, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:46:57.729Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:46:57.846Z: Cleaning up.
    Aug 22, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:46:57.906Z: Stopping worker pool...
    Aug 22, 2021 6:49:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:49:25.063Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 22, 2021 6:49:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T06:49:25.920Z: Worker pool stopped.
    Aug 22, 2021 6:49:33 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-21_23_45_01-12378118859852012271 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 79eeb334-0b3c-4f23-a9bc-6035fede14b2 and timestamp: 2021-08-22T06:49:33.566000000Z:
                     Metric:                    Value:
                   read_time                      8.96
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2021 6:49:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 48.06 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 16s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/33f5frwaxt5wu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2330

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2330/display/redirect>

Changes:


------------------------------------------
[...truncated 349.54 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 22, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 22, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 22, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 22, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 22, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 22, 2021 12:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 22, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 22, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 4bff70f360ef3b7f1b93f344e2ed7a2d1e066dcc1b51a0a32003013489199d3f> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-S_9w82DvO38bk_NE4u16LR4GbcwbUaCjIAMBNIkZnT8.pb
    Aug 22, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 22, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2843793599287009302.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-3j_bj0lkxJ5p2V1f8muU2VUettmPLeKoA6_2XZ4yED4.jar
    Aug 22, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 22, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 22, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 22, 2021 12:45:07 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
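
    The stack trace above is emitted by gRPC's orphaned-channel detector (ManagedChannelOrphanWrapper): a ManagedChannel opened for the BigQuery Storage write client during pipeline validation was garbage-collected before shutdown() was called. It is logged as SEVERE but does not fail the test; the RuntimeException only records the channel's allocation site. The leak sits inside the BigQuery client stack rather than in the test code itself, but the shutdown discipline the detector asks for looks roughly like the following minimal sketch (endpoint and timeout are illustrative, not taken from this build):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        // Illustrative endpoint; any gRPC service is handled the same way.
        ManagedChannel channel =
            ManagedChannelBuilder.forAddress("bigquerystorage.googleapis.com", 443).build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          channel.shutdown(); // begin orderly shutdown
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-terminate calls that did not drain in time
          }
        }
      }
    }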

    Aug 22, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 22, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 22, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 22, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 22, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-21_17_45_07-10588708069122675507?project=apache-beam-testing
    Aug 22, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-21_17_45_07-10588708069122675507
    Aug 22, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-21_17_45_07-10588708069122675507
    Aug 22, 2021 12:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-22T00:45:11.242Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:18.739Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:19.455Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:19.484Z: Expanding GroupByKey operations into optimizable parts.
    Aug 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:19.508Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:19.573Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:19.599Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:19.619Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:19.645Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:19.944Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:20.007Z: Starting 5 workers in us-central1-a...
    Aug 22, 2021 12:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:23.761Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 22, 2021 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:56.239Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Aug 22, 2021 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:45:56.265Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Aug 22, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:46:06.590Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 22, 2021 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:46:30.505Z: Workers have started successfully.
    Aug 22, 2021 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:46:30.546Z: Workers have started successfully.
    Aug 22, 2021 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:46:56.630Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 22, 2021 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:46:56.764Z: Cleaning up.
    Aug 22, 2021 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:46:56.834Z: Stopping worker pool...
    Aug 22, 2021 12:49:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:49:18.572Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 22, 2021 12:49:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-22T00:49:18.610Z: Worker pool stopped.
    Aug 22, 2021 12:49:24 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-21_17_45_07-10588708069122675507 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9bde47aa-4797-4f0d-8d65-ac7c84892273 and timestamp: 2021-08-22T00:49:24.987000000Z:
                     Metric:                    Value:
                   read_time                     6.915
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 22, 2021 12:49:25 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
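
    This warning explains why no metrics left the build: InfluxDBPublisher only uploads results when a target database and measurement are configured, so the read_time and fields_read values above are printed to the console but never persisted. Publishing would require starting the test with the corresponding InfluxDB settings; in Beam's test utilities these are typically supplied as pipeline options such as --influxDatabase and --influxMeasurement along with a reachable --influxHost, though the exact option names used by this job are an assumption based on Beam's other performance-test configurations.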

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 34.358 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/blf2cge4glhrw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2329

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2329/display/redirect>

Changes:


------------------------------------------
[...truncated 346.54 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
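
    The readUsingDefaultMethod failure above is the recurring Coder error: ParDo(RowMonitor) outputs Beam Row values, and Beam cannot infer a Coder for Row unless a schema is attached to the output PCollection. The remedy named in the message, PCollection.setRowSchema, looks roughly like this minimal sketch (the schema fields and values are illustrative, mirroring the query's projected columns; they are not the test's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        // Illustrative schema mirroring the projected columns of the test query.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        PCollection<Row> rows =
            p.apply(Create.of("story"))
                .apply(
                    ParDo.of(
                        new DoFn<String, Row>() {
                          @ProcessElement
                          public void process(@Element String type, OutputReceiver<Row> out) {
                            out.output(
                                Row.withSchema(schema)
                                    .addValues("someone", type, "a title", 3L)
                                    .build());
                          }
                        }))
                // Beam cannot infer a Coder for Row; attaching the schema lets it use RowCoder.
                .setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }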

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 21, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
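
    Taken together, the two plans above are the point of this test: the logical plan scans the full HACKER_NEWS row and applies LogicalFilter/LogicalProject inside Beam, while the optimized BEAMPlan replaces the scan with a BeamPushDownIOSourceRel that requests only the four used fields and hands the supported row filter (logged just above in SQL form) to the BigQuery Storage read session, so projection and filtering happen on the BigQuery side before any rows reach the pipeline workers.
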
    Aug 21, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 21, 2021 6:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 21, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 2798069675b4f0179d7272c694612bc09604a571ec488efc7e657bb7aa630054> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-J5gGlnW08BedcnLGlGErwJYEpXHsSI78fmV7t6pjAFQ.pb
    Aug 21, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 21, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 21, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9130551235243542490.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Tsxn7sopw3b13Ld8iD1KY_haAAme7GWpIl0O6nCFj8g.jar
    Aug 21, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 21, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 21, 2021 6:45:03 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 21, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 21, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 21, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 21, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 21, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-21_11_45_03-12454831965506161028?project=apache-beam-testing
    Aug 21, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-21_11_45_03-12454831965506161028
    Aug 21, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-21_11_45_03-12454831965506161028
    Aug 21, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-21T18:45:07.169Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 21, 2021 6:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:14.510Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 21, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:15.276Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 21, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:15.311Z: Expanding GroupByKey operations into optimizable parts.
    Aug 21, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:15.337Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 21, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:15.383Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 21, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:15.421Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 21, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:15.443Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 21, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:15.466Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 21, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:15.776Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:15.849Z: Starting 5 workers in us-central1-a...
    Aug 21, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:45:23.601Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 21, 2021 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:46:00.508Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 21, 2021 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:46:27.623Z: Workers have started successfully.
    Aug 21, 2021 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:46:27.648Z: Workers have started successfully.
    Aug 21, 2021 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:46:59.171Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2021 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:46:59.337Z: Cleaning up.
    Aug 21, 2021 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:46:59.410Z: Stopping worker pool...
    Aug 21, 2021 6:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:49:15.173Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 21, 2021 6:49:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T18:49:15.214Z: Worker pool stopped.
    Aug 21, 2021 6:49:22 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-21_11_45_03-12454831965506161028 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 2de58102-4232-4b0d-b406-e9b09e209258 and timestamp: 2021-08-21T18:49:22.282000000Z:
                     Metric:                    Value:
                   read_time                     9.451
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2021 6:49:22 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 35.509 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 5s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/xkcu3fafoh7cy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2328

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2328/display/redirect>

Changes:


------------------------------------------
[...truncated 348.74 KB...]
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 21, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 21, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 21, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 21, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115910 bytes, hash a6005b6aadd3395342ec27d0864a7f7743c85a38dd0694f3f705bd198d91dbb6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pgBbaq3TOVNC7CfQhkp_d0PIWjjdBpTz9wW9GY2R27Y.pb
    Aug 21, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 21, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 21, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3200546391937717375.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-RGGNZTF1sl9vAGNYY9OCYE95BMc1asIAIbs78VmsfRw.jar
    Aug 21, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Aug 21, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Aug 21, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Aug 21, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Aug 21, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 243 files cached, 5 files newly uploaded in 0 seconds
    Aug 21, 2021 12:45:07 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 21, 2021 12:45:07 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 21, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 21, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 21, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 21, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 21, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-21_05_45_08-15365953321694350773?project=apache-beam-testing
    Aug 21, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-21_05_45_08-15365953321694350773
    Aug 21, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-21_05_45_08-15365953321694350773
    Aug 21, 2021 12:45:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-21T12:45:11.952Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:18.338Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:19.095Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:19.129Z: Expanding GroupByKey operations into optimizable parts.
    Aug 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:19.171Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:19.266Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:19.297Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:19.329Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:19.375Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:19.710Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:19.778Z: Starting 5 workers in us-central1-a...
    Aug 21, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:45:29.736Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 21, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:46:06.054Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 21, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:46:32.117Z: Workers have started successfully.
    Aug 21, 2021 12:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:46:32.161Z: Workers have started successfully.
    Aug 21, 2021 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:46:59.778Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2021 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:46:59.917Z: Cleaning up.
    Aug 21, 2021 12:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:46:59.979Z: Stopping worker pool...
    Aug 21, 2021 12:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:49:27.302Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 21, 2021 12:49:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T12:49:27.344Z: Worker pool stopped.
    Aug 21, 2021 12:49:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-21_05_45_08-15365953321694350773 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 82526cc8-398a-434d-873a-7a5f0ad013bf and timestamp: 2021-08-21T12:49:34.346000000Z:
                     Metric:                    Value:
                   read_time                      7.19
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2021 12:49:34 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 43.018 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
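
For example, the individual deprecation warnings can be surfaced by re-running the failing task locally with that flag (task path taken from the log above; a sketch, assuming a standard checkout with the Gradle wrapper):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --warning-mode all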

BUILD FAILED in 5m 14s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/qhzd7t6hu6nuo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2327

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2327/display/redirect?page=changes>

Changes:

[noreply] Avoid spamming lull logging (#15366)


------------------------------------------
[...truncated 347.93 KB...]
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 21, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
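
A minimal sketch of how such a push-down-capable table can be declared in Beam SQL (this is not the IT's exact setup; the schema subset and LOCATION below are placeholders): declaring the BigQuery table with method DIRECT_READ routes reads through the Storage Read API, which is what lets the planner fold the WHERE clause and the four used fields into the BeamPushDownIOSourceRel shown above.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // DDL mirrors the logged query; LOCATION is a placeholder, and only the
        // four fields the query uses are declared here for brevity.
        String ddl =
            "CREATE EXTERNAL TABLE HACKER_NEWS (title VARCHAR, `by` VARCHAR, "
                + "score INTEGER, type VARCHAR) "
                + "TYPE bigquery "
                + "LOCATION 'project-id:dataset.hacker_news' "
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";
        PCollection<Row> rows =
            PCollectionTuple.empty(p)
                .apply(
                    SqlTransform.query(
                            "SELECT `by` AS author, type, title, score FROM HACKER_NEWS "
                                + "WHERE (type = 'story' OR type = 'job') AND score > 2")
                        .withDdlString(ddl));
        p.run().waitUntilFinish();
      }
    }
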
    Aug 21, 2021 6:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 21, 2021 6:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 21, 2021 6:44:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115905 bytes, hash f6ef538819e4f289d343c6aaabc8ab847e95913e38eacd86485afb3e2bbb76eb> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9u9TiBnk8onTQ8aqq8irhH6VkT446s2GSFr7Piu7dus.pb
    Aug 21, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 21, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2042072408910196008.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-SZe3GKR5bjzyYFk4y6s4yy8U7AfHu8PqF87Tm8VxTws.jar
    Aug 21, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 21, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.33.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.33.0-SNAPSHOT-qsyth_X-2AynopJtGlEp0y9ZYWCE9nAZ-Mad3LL61Fw.jar
    Aug 21, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.33.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.33.0-SNAPSHOT-4bF8QpaxkWlkdPssX5OV0S9rIzFuRU0pojrN2fBwjjk.jar
    Aug 21, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.33.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.33.0-SNAPSHOT-xz-aSlhizJi8jigGo3amZ2g_MkglE3dBulicsi2UX5U.jar
    Aug 21, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/udf/build/libs/beam-sdks-java-extensions-sql-udf-2.33.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-udf-2.33.0-SNAPSHOT-MlQQAS1Je1cH4lXtIOKM3zL6o97S5F7sJQ3zUailw08.jar
    Aug 21, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 243 files cached, 5 files newly uploaded in 0 seconds
    Aug 21, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 21, 2021 6:45:02 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
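
The SEVERE message above is gRPC's orphaned-channel detector: a ManagedChannel to bigquerystorage.googleapis.com was garbage-collected without being shut down (per the trace, it is created inside the BigQuery client used during pipeline validation, not by the test itself). The discipline the warning asks for, sketched generically for code that owns a channel directly (host and timeout values are illustrative):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forAddress("bigquerystorage.googleapis.com", 443)
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          channel.shutdown();                      // begin graceful shutdown
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();                 // force-cancel in-flight RPCs
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }

For gax-based clients such as BigQueryWriteClient, which manage their channels internally, the equivalent is closing the client itself (it is AutoCloseable, so try-with-resources works), letting it shut down its channel pool.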

    Aug 21, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 21, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 21, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 21, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 21, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-20_23_45_02-6978565694525606586?project=apache-beam-testing
    Aug 21, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-20_23_45_02-6978565694525606586
    Aug 21, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-20_23_45_02-6978565694525606586
    Aug 21, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-21T06:45:06.252Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 21, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:12.257Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 21, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:12.957Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 21, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:13.000Z: Expanding GroupByKey operations into optimizable parts.
    Aug 21, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:13.037Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 21, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:13.111Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 21, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:13.159Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 21, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:13.179Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 21, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:13.221Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 21, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:13.581Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:13.655Z: Starting 5 workers in us-central1-a...
    Aug 21, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:18.764Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 21, 2021 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:45:56.913Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:46:23.295Z: Workers have started successfully.
    Aug 21, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:46:23.323Z: Workers have started successfully.
    Aug 21, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:46:52.888Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:46:53.027Z: Cleaning up.
    Aug 21, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:46:53.105Z: Stopping worker pool...
    Aug 21, 2021 6:49:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:49:10.697Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 21, 2021 6:49:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T06:49:10.731Z: Worker pool stopped.
    Aug 21, 2021 6:49:17 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-20_23_45_02-6978565694525606586 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 09472f8c-d507-4a0d-bb02-2097b2ce8d0f and timestamp: 2021-08-21T06:49:17.700000000Z:
                     Metric:                    Value:
                   read_time                     9.494
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2021 6:49:18 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 31.195 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 0s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/vmirfhpoioy3m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2326

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2326/display/redirect?page=changes>

Changes:

[zhoufek] [BEAM-9487] Disable allowing unsafe triggers by default

[noreply] [BEAM-10917] Add support for BigQuery Read API in Python BEAM (#15185)

[ryanthompson591] Change filter to also retry on 408 errors.


------------------------------------------
[...truncated 347.88 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2071944899]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
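
The IllegalStateException above spells out the fix: a PCollection of generic Beam Rows cannot have its Coder inferred, so the producing step must attach the Row schema. A minimal sketch of that pattern (field names simply mirror the query's output columns; this is not the IT's code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        PCollection<Row> rows =
            p.apply(
                Create.of(
                        Row.withSchema(schema)
                            .addValues("alice", "story", "hello beam", 3L)
                            .build())
                    .withRowSchema(schema));
        // For an existing PCollection<Row> (e.g. the ParDo(RowMonitor) output in
        // the trace above), the same fix is rows.setRowSchema(schema), which is
        // equivalent to rows.setCoder(RowCoder.of(schema)).
        p.run().waitUntilFinish();
      }
    }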

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 21, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 21, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 21, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 21, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 21, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 21, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 21, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 21, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 571880e7ad862c55d6eadfd583bd0bdfd9364b5932e19513c6421427607b0832> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-VxiA562GLFXW6t_Vg70L39k2S1ky4ZUTxkIUJ2B7CDI.pb
    Aug 21, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 21, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 21, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8544077078188900373.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Kfg8ZErK4oE8hT-R22_sK60CXFx82tX5TDZKidyO1hQ.jar
    Aug 21, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 21, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 21, 2021 12:45:06 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 21, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 21, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 21, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 21, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 21, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-20_17_45_06-13135085669233769361?project=apache-beam-testing
    Aug 21, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-20_17_45_06-13135085669233769361
    Aug 21, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-20_17_45_06-13135085669233769361
    Aug 21, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-21T00:45:10.759Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 21, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:17.275Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 21, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:17.855Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 21, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:17.893Z: Expanding GroupByKey operations into optimizable parts.
    Aug 21, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:17.926Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 21, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:17.994Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 21, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:18.029Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 21, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:18.080Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 21, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:18.104Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 21, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:18.421Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:18.500Z: Starting 5 workers in us-central1-a...
    Aug 21, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:45:37.843Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 21, 2021 12:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:46:01.586Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 21, 2021 12:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:46:30.335Z: Workers have started successfully.
    Aug 21, 2021 12:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:46:30.367Z: Workers have started successfully.
    Aug 21, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:47:00.337Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 21, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:47:00.491Z: Cleaning up.
    Aug 21, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:47:00.591Z: Stopping worker pool...
    Aug 21, 2021 12:49:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:49:17.083Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 21, 2021 12:49:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-21T00:49:17.124Z: Worker pool stopped.
    Aug 21, 2021 12:49:23 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-20_17_45_06-13135085669233769361 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dbb477cf-c9a9-4930-afb2-74293e900817 and timestamp: 2021-08-21T00:49:23.959000000Z:
                     Metric:                    Value:
                   read_time                     9.697
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 21, 2021 12:49:24 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 34.988 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 5s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/l3fndcximnw2k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2325

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2325/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-12780] StreamingDataflowWorker should only retry exceptions


------------------------------------------
[...truncated 351.76 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2021 6:45:31 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2021 6:45:32 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 20, 2021 6:45:32 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 20, 2021 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 20, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 20, 2021 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115846 bytes, hash 7c2d884c6319a541e599c8b6f42062a895210b7c8761d2f346e56c1d6f929075> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-fC2ITGMZpUHlmci29CBiqJUhC3yHYdLzRuVsHW-SkHU.pb
    Aug 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-PXF-PpZwr5bQHHf7lFUSqHyObVD2F_y-Eu68Da75nwU.jar
    Aug 20, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5163182414557254175.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-hdgr3eunGOa8A6qmvlHwn7gdiKSeq1KbWNzrZGrY9_M.jar
    Aug 20, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.m2/repository/org/apache/beam/beam-vendor-grpc-1_36_0/0.2/beam-vendor-grpc-1_36_0-0.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-grpc-1_36_0-0.2-4uvN0OifZ8dVz8dFmmyiW5EFIqrlxVJpZTIHDxwD0EU.jar
    Aug 20, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.69/6cb063f1d527ff65bdbb9ea74888a5ffc3f92197/fastjson-1.2.69.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.69-KniRdEoDeOAJmpuB3b7U0uGn6GMvb4b4_8-jg-UeScg.jar
    Aug 20, 2021 6:45:40 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 3 files newly uploaded in 1 second
    Aug 20, 2021 6:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 20, 2021 6:45:40 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
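
The SEVERE entry above is gRPC's orphaned-channel detector: per the allocation trace, a BigQueryWriteClient created during Pipeline.validate() (via BigQueryServicesImpl.getDatasetService) is garbage-collected without its channel ever being closed, so the leak sits in the SDK's service wiring rather than in the test itself. For reference, a minimal sketch of the shutdown pattern the warning asks for, with an assumed 5-second grace period:

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    class ChannelShutdownSketch {
      // Shut down gracefully, then force-terminate if the channel does not
      // drain in time; this is exactly what the warning text recommends.
      static void close(ManagedChannel channel) throws InterruptedException {
        channel.shutdown();
        if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
          channel.shutdownNow();
          channel.awaitTermination(5, TimeUnit.SECONDS);
        }
      }
    }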

    Aug 20, 2021 6:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 20, 2021 6:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 20, 2021 6:45:40 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 20, 2021 6:45:40 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 20, 2021 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-20_11_45_40-9972541951485561019?project=apache-beam-testing
    Aug 20, 2021 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-20_11_45_40-9972541951485561019
    Aug 20, 2021 6:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-20_11_45_40-9972541951485561019
    Aug 20, 2021 6:45:46 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-20T18:45:44.150Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 20, 2021 6:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:45:51.592Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 20, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:45:52.226Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 20, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:45:52.257Z: Expanding GroupByKey operations into optimizable parts.
    Aug 20, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:45:52.286Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 20, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:45:52.348Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 20, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:45:52.375Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 20, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:45:52.407Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 20, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:45:52.434Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 20, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:45:52.781Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:45:52.868Z: Starting 5 workers in us-central1-c...
    Aug 20, 2021 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:46:24.363Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 20, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:46:37.514Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 20, 2021 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:47:03.210Z: Workers have started successfully.
    Aug 20, 2021 6:47:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:47:03.234Z: Workers have started successfully.
    Aug 20, 2021 6:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:47:35.981Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2021 6:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:47:36.162Z: Cleaning up.
    Aug 20, 2021 6:47:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:47:36.244Z: Stopping worker pool...
    Aug 20, 2021 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:49:54.336Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 20, 2021 6:49:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T18:49:54.403Z: Worker pool stopped.
    Aug 20, 2021 6:50:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-20_11_45_40-9972541951485561019 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 763130f0-edab-44d9-993c-3cafa77ebaea and timestamp: 2021-08-20T18:50:01.894000000Z:
                     Metric:                    Value:
                   read_time                    10.143
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2021 6:50:02 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
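
The warning above explains why the read_time and fields_read results are only printed: with no measurement and database configured, InfluxDBPublisher skips publishing. A hedged sketch of the missing configuration; the builder lives in org.apache.beam.sdk.testutils.publishing, and the host, database, and measurement values here are placeholders, not the ones this Jenkins job uses:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    class InfluxSettingsSketch {
      // Hypothetical values; the real job is expected to supply these through
      // pipeline options rather than hard-coded strings.
      static InfluxDBSettings settings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")            // assumed host
            .withDatabase("beam_test_metrics")            // assumed database
            .withMeasurement("sql_bqio_read_java_batch")  // assumed measurement
            .get();
      }
    }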

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 40.938 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 43s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/muehqn44cov3c

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2324

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2324/display/redirect>

Changes:


------------------------------------------
[...truncated 347.56 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
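
This readUsingDefaultMethod failure is the coder-inference problem the exception spells out: the RowMonitor output is a PCollection of Row with no schema attached, so no coder can be inferred. A minimal sketch of the remedy the message itself suggests, PCollection.setRowSchema; the field names follow the query's projection, while the field types are assumptions:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.FieldType;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaSketch {
      // Attach a Row schema so a coder can be inferred downstream.
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", FieldType.STRING)
                .addNullableField("type", FieldType.STRING)
                .addNullableField("title", FieldType.STRING)
                .addNullableField("score", FieldType.INT64) // assumed type
                .build();
        return rows.setRowSchema(schema);
      }
    }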

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 20, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 20, 2021 12:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 20, 2021 12:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 20, 2021 12:44:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash d1ccbb26429bff6e1dd0f71de412e776a4d7c6f080fbc2e2a6109d29d3c64757> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0cy7JkKb_24d0Pcd5BLndqTXxvCA-8LiphCdKdPGR1c.pb
    Aug 20, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 20, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-B31ygN6Eq3qQlhTqdhDrjh2y-wPKKM8uOVdjtRbbUqk.jar
    Aug 20, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4085171995071063160.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-AdJqD8CNA69o4uNBJlrYHmo8Y8FqvEODqGd83Q2lpMk.jar
    Aug 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 0 seconds
    Aug 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 20, 2021 12:45:02 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 20, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 20, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-20_05_45_02-4216812862713000085?project=apache-beam-testing
    Aug 20, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-20_05_45_02-4216812862713000085
    Aug 20, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-20_05_45_02-4216812862713000085
    Aug 20, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-20T12:45:05.947Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 20, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:11.879Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 20, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:12.641Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 20, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:12.684Z: Expanding GroupByKey operations into optimizable parts.
    Aug 20, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:12.711Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 20, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:12.769Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 20, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:12.807Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 20, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:12.838Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 20, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:12.870Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 20, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:13.203Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:13.283Z: Starting 5 workers in us-central1-c...
    Aug 20, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:41.178Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 20, 2021 12:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:45:52.141Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 20, 2021 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:46:21.598Z: Workers have started successfully.
    Aug 20, 2021 12:46:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:46:21.631Z: Workers have started successfully.
    Aug 20, 2021 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:46:58.126Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2021 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:46:58.257Z: Cleaning up.
    Aug 20, 2021 12:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:46:58.338Z: Stopping worker pool...
    Aug 20, 2021 12:49:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:49:23.395Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 20, 2021 12:49:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T12:49:23.423Z: Worker pool stopped.
    Aug 20, 2021 12:49:29 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-20_05_45_02-4216812862713000085 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e63cbd43-0c88-4d62-ac9f-00f5bdb0ca5c and timestamp: 2021-08-20T12:49:29.635000000Z:
                     Metric:                    Value:
                   read_time                    12.151
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2021 12:49:29 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 43.057 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 11s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/w7innjk3i7ckk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2323

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2323/display/redirect>

Changes:


------------------------------------------
[...truncated 346.37 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 20, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 20, 2021 6:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 20, 2021 6:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 20, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash ed3604b9df17d967e5a48f10bf802e2ee89c873b4d1d9ac96e9d4acd1f02714e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7TYEud8X2WflpI8Qv4AuLuichztNHZrJbp1KzR8CcU4.pb
    Aug 20, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 20, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2498293363682070999.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-aS1JV-99-0WlDbGsSfKvG3d2kbQTTivpN-O6WpHUtAU.jar
    Aug 20, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-B31ygN6Eq3qQlhTqdhDrjh2y-wPKKM8uOVdjtRbbUqk.jar
    Aug 20, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 0 seconds
    Aug 20, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 20, 2021 6:45:03 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 20, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 20, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 20, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 20, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 20, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-19_23_45_03-1289373414364405415?project=apache-beam-testing
    Aug 20, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-19_23_45_03-1289373414364405415
    Aug 20, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-19_23_45_03-1289373414364405415
    Aug 20, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-20T06:45:06.859Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 20, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:12.843Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 20, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:13.717Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 20, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:13.757Z: Expanding GroupByKey operations into optimizable parts.
    Aug 20, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:13.794Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 20, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:13.882Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 20, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:13.909Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 20, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:13.942Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 20, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:13.974Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 20, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:14.369Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:14.451Z: Starting 5 workers in us-central1-c...
    Aug 20, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:25.637Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
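
The warning above points at the Cloud Monitoring v3 metricDescriptors list/delete APIs as the cleanup route. As a sketch only (not part of this build; the project and descriptor names are placeholders), deleting a stale custom descriptor with the google-cloud-monitoring Java client might look like:

    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.MetricDescriptorName;

    public class MetricDescriptorCleanup {
      public static void main(String[] args) throws Exception {
        // Placeholder descriptor; enumerate candidates first with
        // client.listMetricDescriptors(...) before deleting anything.
        MetricDescriptorName name = MetricDescriptorName.of(
            "apache-beam-testing", "custom.googleapis.com/example/old_counter");
        try (MetricServiceClient client = MetricServiceClient.create()) {
          client.deleteMetricDescriptor(name); // frees one descriptor slot
        }
      }
    }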
    Aug 20, 2021 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:45:57.521Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 20, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:46:23.131Z: Workers have started successfully.
    Aug 20, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:46:23.167Z: Workers have started successfully.
    Aug 20, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:46:55.046Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:46:55.200Z: Cleaning up.
    Aug 20, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:46:55.318Z: Stopping worker pool...
    Aug 20, 2021 6:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:49:15.182Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 20, 2021 6:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T06:49:15.220Z: Worker pool stopped.
    Aug 20, 2021 6:49:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-19_23_45_03-1289373414364405415 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): dd311021-4fee-435c-822f-bb47bc16847e and timestamp: 2021-08-20T06:49:20.859000000Z:
                     Metric:                    Value:
                   read_time                      9.37
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2021 6:49:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
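
This publisher warning means the run's metrics were computed but not exported. As an assumption about the harness (not read from this log), Beam's test utilities take these values through InfluxDBSettings, roughly like the sketch below; all three values are placeholders:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // Hypothetical wiring; a real run would supply host/database/measurement
    // through the job's options for metrics to actually be published.
    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();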

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 33.737 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/sz5aobzbro2us

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2322

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2322/display/redirect?page=changes>

Changes:

[noreply] [BEAM-3304] Go triggering support (#15239)


------------------------------------------
[...truncated 359.33 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2141376916]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
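
The exception message lists its own remedies. A minimal sketch of both follows, assuming a schemaless PCollection<Row> named `rows` and illustrative field types; neither is taken from this log:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    static PCollection<Row> attachRowSchema(PCollection<Row> rows) {
      Schema schema =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt64Field("score") // field types here are assumptions
              .build();
      // Remedy 1: provide the schema; Beam derives the RowCoder from it.
      return rows.setRowSchema(schema);
      // Remedy 2 (equivalent): rows.setCoder(RowCoder.of(schema));
    }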

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2021 12:46:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2021 12:46:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2021 12:46:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 20, 2021 12:46:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2021 12:46:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2021 12:46:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2021 12:46:16 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 20, 2021 12:46:16 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 20, 2021 12:46:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 20, 2021 12:46:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 20, 2021 12:46:20 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115910 bytes, hash bd8b6e57d3ae7dc6627c49f48f2fe06fcd2f7bc01ca75074278c07066ffd72b4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vYtuV9OufcZifEn0jy_gb80ve8Acp1B0J4wHBm_9crQ.pb
    Aug 20, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 20, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-B31ygN6Eq3qQlhTqdhDrjh2y-wPKKM8uOVdjtRbbUqk.jar
    Aug 20, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2780867783473614260.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ReJHm8zUudrIJgQLmyE8praCkbY5-cOfGQtGaCFwW6A.jar
    Aug 20, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 20, 2021 12:46:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 20, 2021 12:46:23 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
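
The shutdown discipline the warning asks for, as a sketch (`channel` stands for the leaked ManagedChannel; in the Beam code path above the owning BigQueryWriteClient is AutoCloseable, so a try-with-resources over the client should achieve the same effect):

    import java.util.concurrent.TimeUnit;

    import io.grpc.ManagedChannel;

    static void closeChannel(ManagedChannel channel) {
      channel.shutdown(); // stop accepting new calls
      try {
        if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
          channel.shutdownNow(); // force-close if graceful shutdown stalls
        }
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
        channel.shutdownNow();
      }
    }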

    Aug 20, 2021 12:46:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 20, 2021 12:46:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 20, 2021 12:46:23 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 20, 2021 12:46:23 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 20, 2021 12:46:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-19_17_46_23-17207303953140708108?project=apache-beam-testing
    Aug 20, 2021 12:46:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-19_17_46_23-17207303953140708108
    Aug 20, 2021 12:46:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-19_17_46_23-17207303953140708108
    Aug 20, 2021 12:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-20T00:46:27.136Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 20, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:38.051Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 20, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:38.897Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 20, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:38.943Z: Expanding GroupByKey operations into optimizable parts.
    Aug 20, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:38.970Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 20, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:39.044Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 20, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:39.078Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 20, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:39.126Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 20, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:39.170Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 20, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:39.595Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:39.660Z: Starting 5 workers in us-central1-c...
    Aug 20, 2021 12:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:46:54.868Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 20, 2021 12:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:47:23.492Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 20, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:47:51.833Z: Workers have started successfully.
    Aug 20, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:47:51.859Z: Workers have started successfully.
    Aug 20, 2021 12:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:48:23.379Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 20, 2021 12:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:48:23.550Z: Cleaning up.
    Aug 20, 2021 12:48:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:48:23.658Z: Stopping worker pool...
    Aug 20, 2021 12:50:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:50:41.746Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 20, 2021 12:50:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-20T00:50:41.812Z: Worker pool stopped.
    Aug 20, 2021 12:50:49 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-19_17_46_23-17207303953140708108 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4c665a45-6072-4e99-8acd-617eb17f100d and timestamp: 2021-08-20T00:50:49.228000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.415

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2021 12:50:49 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 42.397 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 29s
152 actionable tasks: 104 executed, 48 from cache

Publishing build scan...
https://gradle.com/s/bgofy3lnam6bu

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2321

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2321/display/redirect?page=changes>

Changes:

[Steve Niemitz] [BEAM-12754] Only call getValue once per field per row

[ryanthompson591] Removes test_bad_path

[noreply] Fix broken BQ Integration Test  (#15352)

[noreply] Change conflicting StateReader name from side to reader (#15348)


------------------------------------------
[...truncated 348.24 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 19, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 19, 2021 6:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 19, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 19, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 2ed311e5ee938819b6a81b6196e7d5fdf799cb6846c3a7bc33f0516dc4b9ef64> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LtMR5e6TiBm2qBthlufV_feZy2hGw6e8M_BRbcS572Q.pb
    Aug 19, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 19, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-B31ygN6Eq3qQlhTqdhDrjh2y-wPKKM8uOVdjtRbbUqk.jar
    Aug 19, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.33.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.33.0-SNAPSHOT-tests-z5Sh9dWo0DAeG5w5NaSrkooLfsJ26c7yIXDik9SzDm8.jar
    Aug 19, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7105486771250006767.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-NVxGFGJg9o2aFcQk59WNhc8-Y0cStcXKEzqQkuXeR7U.jar
    Aug 19, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.33.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.33.0-SNAPSHOT-tests-I-osQrG-lSEES6RD4ibuovIeVl-_U3nOEEsBLo1rYrA.jar
    Aug 19, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 3 files newly uploaded in 0 seconds
    Aug 19, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 19, 2021 6:45:08 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 19, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 19, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 19, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 19, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 19, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-19_11_45_09-7949507579967404592?project=apache-beam-testing
    Aug 19, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-19_11_45_09-7949507579967404592
    Aug 19, 2021 6:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-19_11_45_09-7949507579967404592
    Aug 19, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-19T18:45:14.425Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 19, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:22.218Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 19, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:22.919Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 19, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:22.966Z: Expanding GroupByKey operations into optimizable parts.
    Aug 19, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:22.994Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 19, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:23.058Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 19, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:23.093Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 19, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:23.126Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 19, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:23.163Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 19, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:23.561Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2021 6:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:23.644Z: Starting 5 workers in us-central1-c...
    Aug 19, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:45:56.018Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 19, 2021 6:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:46:09.857Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 19, 2021 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:46:34.924Z: Workers have started successfully.
    Aug 19, 2021 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:46:34.949Z: Workers have started successfully.
    Aug 19, 2021 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:47:05.449Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2021 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:47:05.575Z: Cleaning up.
    Aug 19, 2021 6:47:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:47:05.648Z: Stopping worker pool...
    Aug 19, 2021 6:49:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:49:22.182Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 19, 2021 6:49:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T18:49:22.240Z: Worker pool stopped.
    Aug 19, 2021 6:49:28 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-19_11_45_09-7949507579967404592 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 035fb0b4-e187-43b6-bd3a-17e445efb932 and timestamp: 2021-08-19T18:49:28.789000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                      9.71

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2021 6:49:29 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 35 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.001 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.002 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 38.661 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 9s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/qudbdjclxiyiq

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2320

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2320/display/redirect>

Changes:


------------------------------------------
[...truncated 355.20 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1069634906]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
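
The readUsingDefaultMethod failure above comes from the checkState in PCollection.getCoder: the Row PCollection produced for BeamIOSourceRel_95 has neither a coder nor a schema attached, so Beam cannot infer a RowCoder. The error text itself names the remedy (attach a schema, or set a coder explicitly). A minimal, self-contained sketch of that remedy; the field names and types are assumptions read off the SELECT logged below, not the IT's actual schema:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Assumed schema, mirroring the four projected columns of the query below.
        Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();

        Row row = Row.withSchema(schema)
            .addValues("someone", "story", "hello beam", 3L)
            .build();

        // Attaching the schema lets Beam infer RowCoder; without it, getCoder()
        // fails exactly as in the trace above. On an existing PCollection<Row>
        // the equivalent calls are setRowSchema(schema) or setCoder(RowCoder.of(schema)).
        PCollection<Row> rows = p.apply(Create.of(row).withRowSchema(schema));

        p.run().waitUntilFinish();
      }
    }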

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2021 12:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2021 12:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2021 12:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2021 12:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2021 12:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2021 12:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2021 12:45:40 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 19, 2021 12:45:40 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
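
The BEAMPlan and "Pushing down the following filter" lines above are the behavior this IT exists to measure: the planner folded the LogicalProject/LogicalFilter into a BeamPushDownIOSourceRel, so only four fields are read and the supported predicate is evaluated by the BigQuery Storage API instead of inside the pipeline. From SQL, this only requires the table to use the DIRECT_READ method. A sketch of the equivalent table definition, held as a Java constant; the project, dataset, and table names are placeholders, and the column list is trimmed to the fields used here:

    public class PushDownTableDdl {
      // CREATE EXTERNAL TABLE syntax per the Beam SQL BigQuery table provider;
      // the "method" property selects the Storage API, which is what enables
      // the project/filter push-down seen in the plan above.
      static final String DDL =
          "CREATE EXTERNAL TABLE HACKER_NEWS (\n"
              + "  `by` VARCHAR, `type` VARCHAR, `title` VARCHAR, `score` BIGINT\n"
              + ")\n"
              + "TYPE bigquery\n"
              + "LOCATION 'my-project:my_dataset.hacker_news'\n"
              + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";
    }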
    Aug 19, 2021 12:45:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 19, 2021 12:45:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 19, 2021 12:45:46 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 56ef52bf4979e4519419d9c76e0ac12e6176a67be9f8efe49b38c067ba966456> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Vu9Sv0l55FGUGdnHbgrBLmF2pnvp-O_kmzjAZ7qWZFY.pb
    Aug 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 19, 2021 12:45:48 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-B31ygN6Eq3qQlhTqdhDrjh2y-wPKKM8uOVdjtRbbUqk.jar
    Aug 19, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3295957306046285867.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-pEIAux_GoaspQtNcC3DdBLgQCJoNaKK836T2DAa6O80.jar
    Aug 19, 2021 12:45:49 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 19, 2021 12:45:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 19, 2021 12:45:49 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
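
The SEVERE ManagedChannel message above is a resource-leak warning from gRPC's orphan detector, not the cause of the test failure: Pipeline.validate() creates a BigQueryWriteClient internally, and its channel is garbage-collected without ever being shut down. The remedy the warning asks for is ordinary channel hygiene, sketched generically below; the endpoint is illustrative only, and for gax-based clients the same effect comes from closing the client itself (e.g. try-with-resources around BigQueryWriteClient.create(), since gax clients are AutoCloseable):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forAddress("bigquerystorage.googleapis.com", 443)
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs over the channel ...
        } finally {
          channel.shutdown();                      // begin orderly shutdown
          if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {
            channel.shutdownNow();                 // force-close anything still pending
          }
        }
      }
    }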

    Aug 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 19, 2021 12:45:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-19_05_45_50-8947413825275807582?project=apache-beam-testing
    Aug 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-19_05_45_50-8947413825275807582
    Aug 19, 2021 12:45:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-19_05_45_50-8947413825275807582
    Aug 19, 2021 12:45:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-19T12:45:53.970Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:01.060Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:01.812Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:01.851Z: Expanding GroupByKey operations into optimizable parts.
    Aug 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:01.884Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:01.958Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:01.988Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:02.017Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:02.054Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:02.380Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:02.456Z: Starting 5 workers in us-central1-c...
    Aug 19, 2021 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:32.383Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 19, 2021 12:46:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:46:47.351Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 19, 2021 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:47:11.526Z: Workers have started successfully.
    Aug 19, 2021 12:47:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:47:11.559Z: Workers have started successfully.
    Aug 19, 2021 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:47:42.880Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2021 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:47:43.027Z: Cleaning up.
    Aug 19, 2021 12:47:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:47:43.108Z: Stopping worker pool...
    Aug 19, 2021 12:50:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:50:00.629Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 19, 2021 12:50:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T12:50:00.666Z: Worker pool stopped.
    Aug 19, 2021 12:50:07 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-19_05_45_50-8947413825275807582 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): edcc3cc0-34f1-4dbe-84b0-0fbe4b37c9c9 and timestamp: 2021-08-19T12:50:07.275000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.062

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2021 12:50:07 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 37.493 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 46s
152 actionable tasks: 102 executed, 50 from cache

Publishing build scan...
https://gradle.com/s/5mlyb2i5ookjw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2319

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2319/display/redirect>

Changes:


------------------------------------------
[...truncated 347.30 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@888462796]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 19, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2021 6:44:54 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 19, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 19, 2021 6:44:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 19, 2021 6:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 19, 2021 6:44:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash d1a5b00d07247243fc6e1e70e3a7710b7552af284eae2ea78e9d21b1da5a27b3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0aWwDQckckP8bh5w46dxC3VSryhOri6njp0hsdpaJ7M.pb
    Aug 19, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 19, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-B31ygN6Eq3qQlhTqdhDrjh2y-wPKKM8uOVdjtRbbUqk.jar
    Aug 19, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5321523547301347978.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PkgCQw0iZqNqYWCpIE5U99nodkG0iP6UgyAvHJbCjwQ.jar
    Aug 19, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 19, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 19, 2021 6:45:01 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 19, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 19, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 19, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 19, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 19, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-18_23_45_01-16698520255030861927?project=apache-beam-testing
    Aug 19, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-18_23_45_01-16698520255030861927
    Aug 19, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-18_23_45_01-16698520255030861927
    Aug 19, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-19T06:45:06.959Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 19, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:13.877Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 19, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:14.508Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 19, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:14.598Z: Expanding GroupByKey operations into optimizable parts.
    Aug 19, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:14.630Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 19, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:14.761Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 19, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:14.822Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 19, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:14.866Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 19, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:14.920Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 19, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:15.391Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:15.486Z: Starting 5 workers in us-central1-c...
    Aug 19, 2021 6:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:19.969Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 19, 2021 6:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:45:59.728Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 19, 2021 6:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:46:27.377Z: Workers have started successfully.
    Aug 19, 2021 6:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:46:27.429Z: Workers have started successfully.
    Aug 19, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:46:59.750Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2021 6:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:46:59.979Z: Cleaning up.
    Aug 19, 2021 6:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:47:00.093Z: Stopping worker pool...
    Aug 19, 2021 6:49:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:49:17.478Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 19, 2021 6:49:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T06:49:17.546Z: Worker pool stopped.
    Aug 19, 2021 6:49:23 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-18_23_45_01-16698520255030861927 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 070d64ae-3577-4928-ba76-017b53d7fa08 and timestamp: 2021-08-19T06:49:23.231000000Z:
                     Metric:                    Value:
                   read_time                     9.337
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2021 6:49:23 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 37.878 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 6s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/dr6j4iggoqgja

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2318

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2318/display/redirect?page=changes>

Changes:

[heejong] [BEAM-12735] Adding Python XLang examples to the RC validation script

[heejong] update

[heejong] check xlang test outputs

[noreply] [BEAM-3713] Cleanup, remove nosetest references (#15245)

[noreply] [BEAM-12772] Remove test implementation of SamzaIORegistrar (#15347)


------------------------------------------
[...truncated 364.25 KB...]
    Aug 19, 2021 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 19, 2021 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 19, 2021 12:46:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 19, 2021 12:46:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 19, 2021 12:46:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 19, 2021 12:46:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 19, 2021 12:46:20 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 19, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash f66b71f9f9005cd687fde19576925ef2fb2c8d3a9ef0f344d191cd46fe96add3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9mtx-fkAXNaH_eGVdpJe8vssjTqe8PNE0ZHNRv6WrdM.pb
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-B31ygN6Eq3qQlhTqdhDrjh2y-wPKKM8uOVdjtRbbUqk.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1398869350434906037.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-VuNQS7LXJa68Om8qQAbCgr6Tec0nuT19n5sg-rT1Vvo.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-bigtable-emulator/0.125.2/e2c4eccdc638e5883b658a222b99a318a817f3c6/google-cloud-bigtable-emulator-0.125.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigtable-emulator-0.125.2-FiUK-2Jw2KpBfAi4-J15Ft5rFwkLvGw0DsE7fz_A75M.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-api/3.2.7/81408fc988c229ea11354fee9902c47842343f04/docker-java-api-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-api-3.2.7-AwtXDacySf3gUHoixVjjNqW0wUI0YcGJjMO-N7kxBho.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.15.1/ccd374ca7e5d8b06fb31ecffeb03abc42feada84/kafka-1.15.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.15.1-Z4qZ3jdWXDvdB7xD9VivkdVQiRZWNWUhPSIANVCSZDA.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.15.1/91e6dfab8f141f77c6a0dd147a94bd186993a22c/testcontainers-1.15.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.15.1-_fMrLpRXSowMnAISRc73bMk5UqrXNnsv6vkPXOZRXu0.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport-zerodep/3.2.7/d74cb05fabc57e9ec441dc39d0bc58ad649fad3d/docker-java-transport-zerodep-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-zerodep-3.2.7-O2a9rUb-899yDZMu4E8j9ItFOkNoggqbLP2xaLmOwJY.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Aug 19, 2021 12:46:23 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport/3.2.7/315903a129f530422747efc163dd255f0fa2555e/docker-java-transport-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-3.2.7-ynNXlSXhDSyiLCXkA4c9E9_HGeKB7-wbpEzmswFrI-A.jar
    Aug 19, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 235 files cached, 13 files newly uploaded in 2 seconds
    Aug 19, 2021 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 19, 2021 12:46:26 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
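
The SEVERE entry above is gRPC's orphaned-channel detector: a ManagedChannel opened for bigquerystorage.googleapis.com was garbage-collected while still running. Per the trace, it was allocated while validating BigQueryIO.TypedRead and apparently never closed. What the warning asks for looks roughly like the sketch below (a minimal illustration, not the Beam code path itself; the target and timeout are assumptions):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public final class ChannelLifecycle {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forAddress("bigquerystorage.googleapis.com", 443)
                .useTransportSecurity() // TLS, as a real client would use
                .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          channel.shutdown(); // stop accepting new calls
          if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-cancel anything still in flight
          }
        }
      }
    }

In BigQueryServicesImpl the channel is owned by the generated BigQueryWriteClient, so the equivalent cleanup is closing that client (GAPIC clients are AutoCloseable) rather than letting it leak out of validate().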

    Aug 19, 2021 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 19, 2021 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 19, 2021 12:46:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 19, 2021 12:46:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 19, 2021 12:46:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-18_17_46_26-9829944942438848266?project=apache-beam-testing
    Aug 19, 2021 12:46:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-18_17_46_26-9829944942438848266
    Aug 19, 2021 12:46:27 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-18_17_46_26-9829944942438848266
    Aug 19, 2021 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-19T00:46:30.142Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 19, 2021 12:46:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:46:37.644Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 19, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:46:38.411Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 19, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:46:38.456Z: Expanding GroupByKey operations into optimizable parts.
    Aug 19, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:46:38.481Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 19, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:46:38.564Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 19, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:46:38.607Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 19, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:46:38.644Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 19, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:46:38.676Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 19, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:46:39.099Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2021 12:46:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:46:39.196Z: Starting 5 workers in us-central1-c...
    Aug 19, 2021 12:47:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:47:08.586Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 19, 2021 12:47:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:47:18.749Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 19, 2021 12:47:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:47:44.385Z: Workers have started successfully.
    Aug 19, 2021 12:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:47:44.425Z: Workers have started successfully.
    Aug 19, 2021 12:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:48:13.995Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 19, 2021 12:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:48:14.194Z: Cleaning up.
    Aug 19, 2021 12:48:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:48:14.271Z: Stopping worker pool...
    Aug 19, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:50:30.152Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 19, 2021 12:50:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-19T00:50:30.204Z: Worker pool stopped.
    Aug 19, 2021 12:51:01 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-18_17_46_26-9829944942438848266 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): df1eaead-66db-428b-afe7-8e384ca796e2 and timestamp: 2021-08-19T00:51:01.805000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.678

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 19, 2021 12:51:02 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
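
The warning above is harmless to the test outcome: the harness printed the metrics but skipped publishing them because no InfluxDB measurement/database was configured for this run. A configured run would supply both, along the lines of this sketch (the builder is assumed from Beam's test-utils publishing package; host, database, and measurement values are placeholders, so treat the whole block as an assumption):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public final class MetricsPublishingConfig {
      // Assumption: publishWithCheck only publishes when both the database
      // and the measurement are present; otherwise it logs the warning above.
      static InfluxDBSettings settings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")     // placeholder host
            .withDatabase("beam_test_metrics")     // placeholder database
            .withMeasurement("sql_bqio_read_java") // placeholder measurement
            .get();
      }
    }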

Gradle Test Executor 5 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 54.234 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 42s
152 actionable tasks: 106 executed, 46 from cache

Publishing build scan...
https://gradle.com/s/yk4pe2a3elozu

Stopped 4 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2317

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2317/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-12740] [BEAM-8268] Improve error handling and retries for GCS

[aromanenko.dev] [BEAM-12270] Add Parquet source support for TPD-DS benchmark

[noreply] [BEAM-12742] Samza Runner does not properly delete modified timer


------------------------------------------
[...truncated 362.58 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1447309426]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
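
This readUsingDefaultMethod failure is identical in every build of this thread: the RowMonitor ParDo emits Row values, and a PCollection<Row> has no default coder until a schema is attached. The fix the exception itself suggests is PCollection.setRowSchema, sketched below (the field names mirror the query's projected columns; nullability and the exact field types are assumptions):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.schemas.Schema.FieldType;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public final class RowSchemaFix {
      // Schema mirroring the projected columns: author, type, title, score.
      static final Schema SCHEMA =
          Schema.builder()
              .addNullableField("author", FieldType.STRING)
              .addNullableField("type", FieldType.STRING)
              .addNullableField("title", FieldType.STRING)
              .addNullableField("score", FieldType.INT64)
              .build();

      // Attaching the schema lets the CoderRegistry supply a RowCoder,
      // which is exactly what the IllegalStateException asks for.
      static PCollection<Row> fix(PCollection<Row> rowMonitorOutput) {
        return rowMonitorOutput.setRowSchema(SCHEMA);
      }
    }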

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2021 6:53:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2021 6:53:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2021 6:53:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2021 6:53:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2021 6:53:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2021 6:53:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2021 6:53:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 18, 2021 6:53:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
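
The BEAMPlan above is the point of this test: the projection (usedFields) and the supported filter move out of the Calc and into the BigQuery source, so only four columns and the matching rows cross the wire. Outside Beam SQL, a comparable push-down can be requested directly on BigQueryIO via the Storage Read API; a minimal sketch (the table name is a placeholder):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    public final class DirectReadPushDown {
      static TypedRead<TableRow> read() {
        return BigQueryIO.readTableRows()
            .from("my-project:my_dataset.HACKER_NEWS") // placeholder table
            .withMethod(TypedRead.Method.DIRECT_READ)  // BigQuery Storage Read API
            // Server-side projection and filter: the same effect as the
            // usedFields / BigQueryFilter entries in the plan above.
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
      }
    }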
    Aug 18, 2021 6:53:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 18, 2021 6:53:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 18, 2021 6:53:36 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115848 bytes, hash 6da1c259de9b338f713bb59bf8f24f8c8952342a7f2092228f871a3052d6be59> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-baHCWd6bM49xO7Wb-PJPjIlSNCp_IJIij4caMFLWvlk.pb
    Aug 18, 2021 6:53:43 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 18, 2021 6:53:43 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-B31ygN6Eq3qQlhTqdhDrjh2y-wPKKM8uOVdjtRbbUqk.jar
    Aug 18, 2021 6:53:44 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8891981041788860536.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lSqw4SPmXtttdGljyysfklqv4DYXockiHmXoPsQXs2g.jar
    Aug 18, 2021 6:53:49 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 5 seconds
    Aug 18, 2021 6:53:49 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 18, 2021 6:53:49 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 18, 2021 6:53:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 18, 2021 6:53:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 18, 2021 6:53:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 18, 2021 6:53:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 18, 2021 6:53:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-18_11_53_51-9349147923176117060?project=apache-beam-testing
    Aug 18, 2021 6:53:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-18_11_53_51-9349147923176117060
    Aug 18, 2021 6:53:52 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-18_11_53_51-9349147923176117060
    Aug 18, 2021 6:53:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-18T18:53:55.110Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 18, 2021 6:54:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:01.260Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 18, 2021 6:54:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:01.912Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 18, 2021 6:54:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:01.958Z: Expanding GroupByKey operations into optimizable parts.
    Aug 18, 2021 6:54:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:02.019Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 18, 2021 6:54:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:02.108Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 18, 2021 6:54:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:02.163Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 18, 2021 6:54:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:02.200Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 18, 2021 6:54:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:02.234Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 18, 2021 6:54:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:02.655Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2021 6:54:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:02.749Z: Starting 5 workers in us-central1-c...
    Aug 18, 2021 6:54:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:16.310Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 18, 2021 6:54:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:54:48.725Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 18, 2021 6:55:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:55:16.094Z: Workers have started successfully.
    Aug 18, 2021 6:55:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:55:16.140Z: Workers have started successfully.
    Aug 18, 2021 6:55:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:55:47.965Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2021 6:55:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:55:48.139Z: Cleaning up.
    Aug 18, 2021 6:55:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:55:48.232Z: Stopping worker pool...
    Aug 18, 2021 6:58:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:58:04.495Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 18, 2021 6:58:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T18:58:04.540Z: Worker pool stopped.
    Aug 18, 2021 6:58:10 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-18_11_53_51-9349147923176117060 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d958c59e-9db9-4b90-b9cd-161330ae14f4 and timestamp: 2021-08-18T18:58:10.779000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.059

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2021 6:58:12 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.144 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.176 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 5 mins 31.325 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 21s
152 actionable tasks: 99 executed, 53 from cache

Publishing build scan...
https://gradle.com/s/kdtebqptr3ioe

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2316

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2316/display/redirect>

Changes:


------------------------------------------
[...truncated 348.19 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2021 12:45:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 18, 2021 12:45:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 18, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 18, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 18, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 727659362d738f27b1aa570c18bc88ebed03a89035e7c1306a251d2e20e23dd4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cnZZNi1zjyexqlcMGLyI6-0DqJA158EwaiUdLiDiPdQ.pb
    Aug 18, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 18, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7964964245405254826.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lLPnW_Qq5zQPGRjaHPZ5cbqRItp47depodOk4SupmGs.jar
    Aug 18, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-B31ygN6Eq3qQlhTqdhDrjh2y-wPKKM8uOVdjtRbbUqk.jar
    Aug 18, 2021 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 0 seconds
    Aug 18, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 18, 2021 12:45:12 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 18, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 18, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 18, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 18, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 18, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-18_05_45_13-5386210059395018708?project=apache-beam-testing
    Aug 18, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-18_05_45_13-5386210059395018708
    Aug 18, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-18_05_45_13-5386210059395018708
    Aug 18, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-18T12:45:16.597Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 18, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:23.924Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:26.438Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:26.473Z: Expanding GroupByKey operations into optimizable parts.
    Aug 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:26.511Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:26.577Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:26.613Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:26.650Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:26.676Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:27.063Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2021 12:45:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:27.146Z: Starting 5 workers in us-central1-c...
    Aug 18, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:45:34.994Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
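
The quota warning above links to the raw API explorer; the same cleanup can be scripted against the Cloud Monitoring v3 Java client. A hedged sketch only (the google-cloud-monitoring client is assumed on the classpath, and deleting descriptors in a shared project such as apache-beam-testing should not be done casually):

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ProjectName;

    public class MetricDescriptorCleanupSketch {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          for (MetricDescriptor d :
              client.listMetricDescriptors(ProjectName.of("apache-beam-testing"))
                    .iterateAll()) {
            // Only the custom.googleapis.com/* descriptors count against the
            // quota mentioned in the warning.
            if (d.getType().startsWith("custom.googleapis.com/")) {
              client.deleteMetricDescriptor(d.getName());
            }
          }
        }
      }
    }
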
    Aug 18, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:46:09.845Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 18, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:46:37.293Z: Workers have started successfully.
    Aug 18, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:46:37.326Z: Workers have started successfully.
    Aug 18, 2021 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:47:09.677Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2021 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:47:09.834Z: Cleaning up.
    Aug 18, 2021 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:47:09.907Z: Stopping worker pool...
    Aug 18, 2021 12:49:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:49:25.587Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 18, 2021 12:49:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T12:49:25.633Z: Worker pool stopped.
    Aug 18, 2021 12:49:33 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-18_05_45_13-5386210059395018708 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7ce9309b-5f41-4b72-81db-a60968c03dd6 and timestamp: 2021-08-18T12:49:33.056000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.474

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2021 12:49:33 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
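
The metrics above are computed but dropped because the publisher was given no measurement/database. A sketch of supplying them through Beam's test utilities; the builder is real, but the endpoint, database, and measurement names below are placeholders, not this job's configuration:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static InfluxDBSettings settings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")            // hypothetical endpoint
            .withDatabase("beam_test_metrics")            // hypothetical database
            .withMeasurement("sql_bqio_read_java_batch")  // hypothetical measurement
            .get();
      }
    }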

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 37.262 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 9s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
https://gradle.com/s/mtxufeiudbimw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2315

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2315/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-5379] Go modules basic support and upgrade to Go 1.16.5.

[daniel.o.programmer] [BEAM-5379] Go Jenkins tests adjusted to use modules.

[daniel.o.programmer] [BEAM-5379] Add /v2/ to go module path and update references.

[daniel.o.programmer] [BEAM-5379] Fix Go SDK proto generation, and update protos.

[daniel.o.programmer] [BEAM-5379] Go module work fixup.


------------------------------------------
[...truncated 365.45 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
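
The exception text above names the remedy: a ParDo that emits Beam Rows gives the SDK nothing to infer a coder from, so the schema has to be attached to the output PCollection explicitly. A minimal sketch of that fix, assuming a hypothetical four-field schema mirroring the query's projection (illustrative, not the test's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // Hypothetical schema matching the projected columns in the SQL above.
        final Schema schema = Schema.builder()
            .addNullableField("author", Schema.FieldType.STRING)
            .addNullableField("type", Schema.FieldType.STRING)
            .addNullableField("title", Schema.FieldType.STRING)
            .addNullableField("score", Schema.FieldType.INT64)
            .build();
        PCollection<Row> rows = p
            .apply(Create.of(1, 2, 3))
            .apply(ParDo.of(new DoFn<Integer, Row>() {
              @ProcessElement
              public void process(@Element Integer i, OutputReceiver<Row> out) {
                out.output(Row.withSchema(schema)
                    .addValues("user" + i, "story", "title " + i, i.longValue())
                    .build());
              }
            }));
        // Without this, coder inference fails exactly as in the stack trace.
        rows.setRowSchema(schema);
        // Equivalent alternative: rows.setCoder(RowCoder.of(schema));
        p.run().waitUntilFinish();
      }
    }

Attaching the schema (or a RowCoder) before the pipeline is finalized is what satisfies the check that fails in PCollection.getCoder.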

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2021 6:46:33 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2021 6:46:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 18, 2021 6:46:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
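
At the IO level, the push-down shown in the BEAMPlan amounts to a Storage API read that projects only the used fields and evaluates the supported filter server-side. A sketch of the equivalent direct BigQueryIO wiring (the table path is hypothetical; the test reaches this through the SQL table provider, not this code):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full")  // hypothetical table
            .withMethod(Method.DIRECT_READ)
            // Column projection: only the fields the query uses are read.
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // Filter push-down: evaluated by the Storage API, not by workers.
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }
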
    Aug 18, 2021 6:46:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 18, 2021 6:46:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 18, 2021 6:46:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 0ad1b950844c3aba1d4893ca5b7025b1fd22b12676a46e660316de9789a77744> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-CtG5UIRMOrodSJPKW3Alsf0isSZ2pG5mAxbel4mnd0Q.pb
    Aug 18, 2021 6:46:39 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 18, 2021 6:46:39 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-B31ygN6Eq3qQlhTqdhDrjh2y-wPKKM8uOVdjtRbbUqk.jar
    Aug 18, 2021 6:46:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9019943434918061148.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-9-Fd05ng9ZwFEy955ewyqREZUrtVae_z4rqmWnxhCRM.jar
    Aug 18, 2021 6:46:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 18, 2021 6:46:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 18, 2021 6:46:40 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
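
The orphan-channel warning is gRPC's cleanup hook noticing that a ManagedChannel for bigquerystorage.googleapis.com was garbage-collected without an orderly shutdown; in this trace the channel is owned by the BigQueryWriteClient that BigQueryIO creates while validating the pipeline, so closing that client would be the analogous fix. A minimal sketch of the shutdown discipline the warning asks for, with an illustrative target and timeouts:

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel = ManagedChannelBuilder
            .forTarget("bigquerystorage.googleapis.com:443")  // target from the log
            .useTransportSecurity()
            .build();
        try {
          // ... issue RPCs through stubs bound to this channel ...
        } finally {
          channel.shutdown();  // stop new calls, let in-flight calls finish
          if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {
            channel.shutdownNow();  // illustrative timeout; force-cancel stragglers
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }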

    Aug 18, 2021 6:46:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 18, 2021 6:46:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 18, 2021 6:46:40 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 18, 2021 6:46:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 18, 2021 6:46:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-17_23_46_40-17551370218448594561?project=apache-beam-testing
    Aug 18, 2021 6:46:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-17_23_46_40-17551370218448594561
    Aug 18, 2021 6:46:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-17_23_46_40-17551370218448594561
    Aug 18, 2021 6:46:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-18T06:46:44.399Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 18, 2021 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:46:49.374Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 18, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:46:50.036Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 18, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:46:50.076Z: Expanding GroupByKey operations into optimizable parts.
    Aug 18, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:46:50.106Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 18, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:46:50.163Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 18, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:46:50.201Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 18, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:46:50.236Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 18, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:46:50.258Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 18, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:46:50.581Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2021 6:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:46:50.659Z: Starting 5 workers in us-central1-c...
    Aug 18, 2021 6:47:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:47:14.134Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 18, 2021 6:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:47:35.950Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 18, 2021 6:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:48:01.542Z: Workers have started successfully.
    Aug 18, 2021 6:48:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:48:01.639Z: Workers have started successfully.
    Aug 18, 2021 6:48:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:48:31.399Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2021 6:48:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:48:31.559Z: Cleaning up.
    Aug 18, 2021 6:48:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:48:31.653Z: Stopping worker pool...
    Aug 18, 2021 6:50:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:50:49.447Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 18, 2021 6:50:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T06:50:49.496Z: Worker pool stopped.
    Aug 18, 2021 6:50:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-17_23_46_40-17551370218448594561 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d321ab62-fed7-45ca-9a5e-042798fba3c5 and timestamp: 2021-08-18T06:50:56.403000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.801

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2021 6:50:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 5 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 34.576 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 36s
152 actionable tasks: 106 executed, 46 from cache

Publishing build scan...
https://gradle.com/s/gei6oq6dru7ua

Stopped 4 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2314

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2314/display/redirect?page=changes>

Changes:

[ruane.jb] Fix package containing RestrictionProvider

[randomstep] BEAM-12635 Bump Apache Compress to 1.21

[noreply] Fix line length

[noreply] Fix whitespace.

[noreply] Take formatters suggestion

[noreply] [BEAM-11088] Clean up incorrect comment (#15345)

[noreply] [BEAM-10955] Update Flink minor versions and enable testSavepointRest…

[noreply] Fix grpc data read thread block with finished instruction_id in


------------------------------------------
[...truncated 356.63 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2021 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2021 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2021 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 18, 2021 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 18, 2021 12:45:27 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 18, 2021 12:45:27 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 18, 2021 12:45:28 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 18, 2021 12:45:28 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 18, 2021 12:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 18, 2021 12:45:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 18, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash e16a88c5f54f2f6bbf1fd8faf76cc500488197bd6f604714cd9af4ff836a706b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4WqIxfVPL2u_H9j692zFAEiBl71vYEcUzZr0_4NqcGs.pb
    Aug 18, 2021 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 18, 2021 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-KS-F6sv5oMHnh51ALIUaW_EjVBOwzNQmWOBtHs4zmwA.jar
    Aug 18, 2021 12:45:34 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8457999759773835795.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-203ZXkpl_lvqFnEoH05lKcdn7K8DjKccY-lOxlFwaFA.jar
    Aug 18, 2021 12:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 18, 2021 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 18, 2021 12:45:35 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 18, 2021 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 18, 2021 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 18, 2021 12:45:35 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 18, 2021 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 18, 2021 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-17_17_45_35-9061007386102785355?project=apache-beam-testing
    Aug 18, 2021 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-17_17_45_35-9061007386102785355
    Aug 18, 2021 12:45:36 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-17_17_45_35-9061007386102785355
    Aug 18, 2021 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-18T00:45:39.067Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 18, 2021 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:45:45.612Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 18, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:45:46.291Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 18, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:45:46.336Z: Expanding GroupByKey operations into optimizable parts.
    Aug 18, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:45:46.369Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 18, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:45:46.437Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 18, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:45:46.474Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 18, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:45:46.500Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 18, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:45:46.534Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 18, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:45:46.887Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2021 12:45:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:45:46.980Z: Starting 5 workers in us-central1-c...
    Aug 18, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:46:19.322Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 18, 2021 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:46:30.692Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 18, 2021 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:46:56.903Z: Workers have started successfully.
    Aug 18, 2021 12:46:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:46:56.933Z: Workers have started successfully.
    Aug 18, 2021 12:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:47:30.081Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 18, 2021 12:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:47:30.309Z: Cleaning up.
    Aug 18, 2021 12:47:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:47:30.419Z: Stopping worker pool...
    Aug 18, 2021 12:49:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:49:46.226Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 18, 2021 12:49:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-18T00:49:46.284Z: Worker pool stopped.
    Aug 18, 2021 12:49:52 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-17_17_45_35-9061007386102785355 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f189f742-f226-41f9-a1a9-bfcbdfe72f74 and timestamp: 2021-08-18T00:49:52.984000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.887

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 18, 2021 12:49:53 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 35.281 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 33s
152 actionable tasks: 102 executed, 50 from cache

Publishing build scan...
https://gradle.com/s/m3nad37nxjmxc

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2313

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2313/display/redirect?page=changes>

Changes:

[Jan Lukavský] [BEAM-12710] Disable TestStreamTest.testFirstElementLate for portable

[aromanenko.dev] [BEAM-12429] Add support for S3 Bucket Key at the object level

[noreply] [BEAM-12755] Stop throwing exceptions during formatting and calculation

[noreply] [BEAM-12734] Autopopulate opts using Go flags. (#15311)

[noreply] [BEAM-12768] Make the test less strict and instead match on substring


------------------------------------------
[...truncated 359.81 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2021 6:58:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2021 6:58:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2021 6:58:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2021 6:58:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2021 6:58:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2021 6:58:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2021 6:58:06 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 17, 2021 6:58:06 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 17, 2021 6:58:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 17, 2021 6:58:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 17, 2021 6:58:15 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 5b3979720592263cfaed0a679be284c830d67fb0c09218a3c9a073fa8bfe8434> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Wzl5cgWSJjz67Qpnm-KEyDDWf7DAkhijyaBz-ov-hDQ.pb
    Aug 17, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 17, 2021 6:58:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-KS-F6sv5oMHnh51ALIUaW_EjVBOwzNQmWOBtHs4zmwA.jar
    Aug 17, 2021 6:58:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6368025026639234533.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ECqsSrg8XwKqgF5Lr4vomUs2Nfw1bjB1lH4Ma7ZFdrI.jar
    Aug 17, 2021 6:58:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 6 seconds
    Aug 17, 2021 6:58:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 17, 2021 6:58:30 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
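
The SEVERE message above comes from gRPC's orphaned-channel detector: the trace shows a BigQueryWriteClient channel opened during pipeline validation (BigQueryIO$TypedRead.validate) being garbage-collected without ever being closed. The cleanup the warning asks for looks roughly like this (target and timeout are illustrative):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    class ChannelCleanupSketch {
      static void runThenShutDown() throws InterruptedException {
        ManagedChannel channel = ManagedChannelBuilder
            .forTarget("bigquerystorage.googleapis.com:443")
            .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          channel.shutdown();  // stop accepting new calls
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();  // force-cancel anything still in flight
          }
        }
      }
    }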

    Aug 17, 2021 6:58:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 17, 2021 6:58:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 17, 2021 6:58:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 17, 2021 6:58:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 17, 2021 6:58:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-17_11_58_32-4373015158718116655?project=apache-beam-testing
    Aug 17, 2021 6:58:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-17_11_58_32-4373015158718116655
    Aug 17, 2021 6:58:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-17_11_58_32-4373015158718116655
    Aug 17, 2021 6:58:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-17T18:58:35.793Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
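
That warning is expected for this perf test: autoscaling is explicitly disabled, so the pool is pinned at the requested worker count and maxNumWorkers has no effect. A minimal sketch of the corresponding Dataflow options, with values chosen to match the log rather than taken from the test's actual setup:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    class WorkerPoolOptionsSketch {
      static DataflowPipelineOptions fixedPoolOfFive() {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.as(DataflowPipelineOptions.class);
        options.setNumWorkers(5);  // fixed pool size
        // Disabling autoscaling is what makes maxNumWorkers ignored.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        return options;
      }
    }
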
    Aug 17, 2021 6:59:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:04.544Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 17, 2021 6:59:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:14.359Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 17, 2021 6:59:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:15.203Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 17, 2021 6:59:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:15.257Z: Expanding GroupByKey operations into optimizable parts.
    Aug 17, 2021 6:59:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:15.296Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 17, 2021 6:59:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:15.366Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 17, 2021 6:59:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:15.401Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 17, 2021 6:59:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:15.434Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 17, 2021 6:59:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:15.468Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 17, 2021 6:59:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:15.959Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2021 6:59:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T18:59:16.035Z: Starting 5 workers in us-central1-c...
    Aug 17, 2021 7:03:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T19:03:55.588Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Aug 17, 2021 7:03:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T19:03:55.849Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Aug 17, 2021 7:04:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T19:04:06.233Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 17, 2021 7:04:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T19:04:31.308Z: Workers have started successfully.
    Aug 17, 2021 7:04:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T19:04:31.358Z: Workers have started successfully.
    Aug 17, 2021 7:05:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T19:05:02.660Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2021 7:05:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T19:05:03.008Z: Cleaning up.
    Aug 17, 2021 7:05:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T19:05:03.142Z: Stopping worker pool...
    Aug 17, 2021 7:07:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T19:07:17.171Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 17, 2021 7:07:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T19:07:17.210Z: Worker pool stopped.
    Aug 17, 2021 7:07:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-17_11_58_32-4373015158718116655 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 10658942-a8a1-4b15-8e31-f99b5834e613 and timestamp: 2021-08-17T19:07:27.727000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.848

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2021 7:07:28 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
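
So the two metrics above were computed but never persisted: the InfluxDB measurement/database properties were not supplied to the publisher. Purely as a hedged illustration (hypothetical host, database, and measurement names, and raw InfluxDB 1.x line protocol rather than Beam's publisher API), writing those values by hand could look like:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    class InfluxLineProtocolSketch {
      public static void main(String[] args) throws Exception {
        // Hypothetical endpoint: database "beam_test_metrics" on a local InfluxDB 1.x.
        URL url = new URL("http://localhost:8086/write?db=beam_test_metrics");
        // Hypothetical measurement name; field values copied from the run above.
        String line = "sql_bqio_read,test=readUsingDirectReadMethodPushDown "
            + "fields_read=4375276.0,read_time=7.848";
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
          out.write(line.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP " + conn.getResponseCode());  // 204 = written
      }
    }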

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.085 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.07 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 10 mins 0.969 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 13m 34s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/bxg4xhvmcvvby

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2312

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2312/display/redirect>

Changes:


------------------------------------------
[...truncated 347.03 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 17, 2021 12:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 17, 2021 12:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 17, 2021 12:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 17, 2021 12:44:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115908 bytes, hash cd0f7c843793da2d661f7d9d2dcb5fbfc4f763ba8b7e2cba6f2974c3d8a3b042> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-zQ98hDeT2i1mH32dLctfv8T3Y7qLfiy6byl0w9ijsEI.pb
    Aug 17, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 17, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-KS-F6sv5oMHnh51ALIUaW_EjVBOwzNQmWOBtHs4zmwA.jar
    Aug 17, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3841556149979231988.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test--Dl-cDzE3YBCcL7gGkHIUhR-Q2LlWc7tdVCmAtSo78A.jar
    Aug 17, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 17, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 17, 2021 12:45:02 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 17, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 17, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 17, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 17, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 17, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-17_05_45_02-3639656065009851551?project=apache-beam-testing
    Aug 17, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-17_05_45_02-3639656065009851551
    Aug 17, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-17_05_45_02-3639656065009851551
    Aug 17, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-17T12:45:06.319Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:15.746Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:16.442Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:16.481Z: Expanding GroupByKey operations into optimizable parts.
    Aug 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:16.513Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:16.584Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:16.613Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:16.647Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:16.679Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:17.079Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:17.157Z: Starting 5 workers in us-central1-c...
    Aug 17, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:38.223Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 17, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:45:56.354Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 17, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:46:21.552Z: Workers have started successfully.
    Aug 17, 2021 12:46:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:46:21.590Z: Workers have started successfully.
    Aug 17, 2021 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:46:52.840Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2021 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:46:53.048Z: Cleaning up.
    Aug 17, 2021 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:46:53.135Z: Stopping worker pool...
    Aug 17, 2021 12:49:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:49:10.577Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 17, 2021 12:49:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T12:49:10.623Z: Worker pool stopped.
    Aug 17, 2021 12:49:17 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-17_05_45_02-3639656065009851551 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 72b3f82b-ca1b-416d-8d82-83f9efb01036 and timestamp: 2021-08-17T12:49:17.111000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.304

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2021 12:49:17 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 30.713 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 59s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/mrov2t2kk4pwk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2311

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2311/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11088] TestStream utility and testing improvements (#15320)


------------------------------------------
[...truncated 353.93 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2021 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2021 6:45:24 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 17, 2021 6:45:25 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 17, 2021 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 17, 2021 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 17, 2021 6:45:29 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash bc59b9a98883e31498aabbcad56b51c752383d11bf3ef6840a338ff7ad9803fe> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vFm5qYiD4xSYqrvK1WtRx1I4PRG_PvaECjOP962YA_4.pb
    Aug 17, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 17, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-KS-F6sv5oMHnh51ALIUaW_EjVBOwzNQmWOBtHs4zmwA.jar
    Aug 17, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6233236780142514680.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-h5YloY11t30LpjR0kXykqAb0rRrSUsFSTZG0fzybZDM.jar
    Aug 17, 2021 6:45:32 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 17, 2021 6:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 17, 2021 6:45:32 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 17, 2021 6:45:32 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 17, 2021 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 17, 2021 6:45:33 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 17, 2021 6:45:33 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 17, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-16_23_45_33-18138878207629902660?project=apache-beam-testing
    Aug 17, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-16_23_45_33-18138878207629902660
    Aug 17, 2021 6:45:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-16_23_45_33-18138878207629902660
    Aug 17, 2021 6:45:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-17T06:45:36.604Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 17, 2021 6:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:43.204Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 17, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:43.823Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 17, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:43.854Z: Expanding GroupByKey operations into optimizable parts.
    Aug 17, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:43.887Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 17, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:43.967Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 17, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:44.007Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 17, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:44.043Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 17, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:44.088Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 17, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:44.455Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:44.541Z: Starting 5 workers in us-central1-c...
    Aug 17, 2021 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:45:57.148Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 17, 2021 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:46:19.728Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 17, 2021 6:46:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:46:19.756Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 17, 2021 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:46:30.030Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 17, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:46:54.707Z: Workers have started successfully.
    Aug 17, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:46:54.744Z: Workers have started successfully.
    Aug 17, 2021 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:47:25.659Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2021 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:47:25.837Z: Cleaning up.
    Aug 17, 2021 6:47:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:47:25.913Z: Stopping worker pool...
    Aug 17, 2021 6:49:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:49:38.970Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 17, 2021 6:49:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T06:49:39.055Z: Worker pool stopped.
    Aug 17, 2021 6:49:46 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-16_23_45_33-18138878207629902660 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): cdebcfad-0197-4b3f-b95b-ec0ba2460c4e and timestamp: 2021-08-17T06:49:46.774000000Z:
                     Metric:                    Value:
                   read_time                     8.486
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2021 6:49:47 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 34.059 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 26s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/zcd5kn2qmlo6c

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2310

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2310/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-12609] Make type parameter part of PushdownProjector interface.

[pascal.gillet] [BEAM-12479] Fixes UnsupportedOperationException

[baeminbo] [BEAM-12504] Make CreateTransaction wait on input signal

[Robert Burke] redundant build


------------------------------------------
[...truncated 353.86 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2021 12:52:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2021 12:52:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2021 12:52:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 17, 2021 12:52:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 17, 2021 12:52:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 17, 2021 12:52:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 17, 2021 12:52:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 17, 2021 12:52:32 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
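
The BEAMPlan above shows the optimizer splitting the query into a BeamCalcRel over a BeamPushDownIOSourceRel, with the WHERE predicate handed to the BigQuery storage read. For illustration only (this is not the integration test's actual code, and predicate push-down itself only happens against table providers that support it, such as the BigQuery provider used here), the same query shape can be run with SqlTransform over any schema-aware PCollection via the built-in PCOLLECTION table:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Illustrative schema covering the four fields the plan reads.
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

        PCollection<Row> input = p.apply(
            Create.of(Row.withSchema(schema)
                    .addValues("someone", "story", "a title", 3L)
                    .build())
                .withRowSchema(schema));

        // Same SELECT/WHERE shape the planner handled above; against the
        // BigQuery table provider this predicate is pushed into the read.
        PCollection<Row> result = input.apply(
            SqlTransform.query(
                "SELECT `by` AS author, `type`, `title`, `score` "
                    + "FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

        p.run().waitUntilFinish();
      }
    }
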
    Aug 17, 2021 12:52:32 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 17, 2021 12:52:37 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 17, 2021 12:52:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash a060d2fd91a480e226513dc4c5679fc435aa9047fa1b7f95936227811644baad> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-oGDS_ZGkgOImUT3ExWefxDWqkEf6G3-Vk2IngRZEuq0.pb
    Aug 17, 2021 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 17, 2021 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-KS-F6sv5oMHnh51ALIUaW_EjVBOwzNQmWOBtHs4zmwA.jar
    Aug 17, 2021 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.33.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.33.0-SNAPSHOT-tests-hPTh8ORAQSn0U3ZY9s7ryokdtEksosa2jOx4HFf7w2I.jar
    Aug 17, 2021 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2914103233911604062.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-gpFukvjNNJH7I5u-_4r70HbNHwxp5t6uiONnbBQvnVo.jar
    Aug 17, 2021 12:52:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.33.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.33.0-SNAPSHOT-9snRLlhv_9RNHAtkdzEwt0C9GxyhH_eJndGOqadZfm4.jar
    Aug 17, 2021 12:52:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 3 files newly uploaded in 0 seconds
    Aug 17, 2021 12:52:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 17, 2021 12:52:41 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
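
The SEVERE message above is gRPC's orphaned-channel detector: a ManagedChannel was garbage-collected without being shut down. A minimal sketch of the shutdown pattern the warning asks for, with an illustrative endpoint and timeouts (not Beam's actual client code):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel = ManagedChannelBuilder
            .forAddress("bigquerystorage.googleapis.com", 443)
            .useTransportSecurity()
            .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          channel.shutdown();                        // start orderly shutdown
          if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
            channel.shutdownNow();                   // force-close stragglers
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }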

    Aug 17, 2021 12:52:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 17, 2021 12:52:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 17, 2021 12:52:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 17, 2021 12:52:41 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 17, 2021 12:52:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-16_17_52_41-6501503531420061983?project=apache-beam-testing
    Aug 17, 2021 12:52:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-16_17_52_41-6501503531420061983
    Aug 17, 2021 12:52:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-16_17_52_41-6501503531420061983
    Aug 17, 2021 12:52:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-17T00:52:44.908Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 17, 2021 12:52:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:52:52.298Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 17, 2021 12:52:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:52:52.985Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 17, 2021 12:52:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:52:53.060Z: Expanding GroupByKey operations into optimizable parts.
    Aug 17, 2021 12:52:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:52:53.107Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 17, 2021 12:52:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:52:53.193Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 17, 2021 12:52:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:52:53.238Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 17, 2021 12:52:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:52:53.290Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 17, 2021 12:52:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:52:53.326Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 17, 2021 12:52:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:52:53.844Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2021 12:52:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:52:53.933Z: Starting 5 workers in us-central1-c...
    Aug 17, 2021 12:53:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:53:14.698Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 17, 2021 12:53:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:53:49.960Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 17, 2021 12:54:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:54:16.817Z: Workers have started successfully.
    Aug 17, 2021 12:54:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:54:16.869Z: Workers have started successfully.
    Aug 17, 2021 12:54:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:54:44.787Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 17, 2021 12:54:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:54:45.105Z: Cleaning up.
    Aug 17, 2021 12:54:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:54:45.296Z: Stopping worker pool...
    Aug 17, 2021 12:56:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:56:54.560Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 17, 2021 12:56:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-17T00:56:54.631Z: Worker pool stopped.
    Aug 17, 2021 12:57:00 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-16_17_52_41-6501503531420061983 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): cbaf5084-4c77-4f25-a7b0-9fff1b954e7c and timestamp: 2021-08-17T00:57:00.837000000Z:
                     Metric:                    Value:
                   read_time                     9.922
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 17, 2021 12:57:01 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 38.905 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 30s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/cpaupgvencsyw

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2309

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2309/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-12619] Swap LinkedBlockingQueue to ArrayBlockingQueue for minor

[Luke Cwik] fixup! Fix spotbugs warning

[Jeremy Quinn] Add AWS services as a runtime dependency to support S3

[Andrew Pilloud] [BEAM-12759] ORDER BY then SELECT

[noreply] [BEAM-7745] Avoid uncached state fetches for streaming side-inputs


------------------------------------------
[...truncated 349.83 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
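
The IllegalStateException above lists its own remedies: set a Coder explicitly with setCoder(), or, since the elements are Beam Rows, attach a schema with setRowSchema(). A minimal sketch of the schema route, with illustrative names (not the integration test's code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFixSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        Schema schema = Schema.builder().addStringField("title").build();

        PCollection<Row> source =
            p.apply(Create.of(Row.withSchema(schema).addValue("hello").build())
                .withRowSchema(schema));

        // A pass-through ParDo standing in for ParDo(RowMonitor): its output
        // is again a PCollection<Row>, so it needs a schema attached too.
        PCollection<Row> monitored = source
            .apply(ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void processElement(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            .setRowSchema(schema); // the fix the error message suggests

        p.run().waitUntilFinish();
      }
    }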

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2021 6:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2021 6:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2021 6:46:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2021 6:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2021 6:46:45 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2021 6:46:45 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2021 6:46:46 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 16, 2021 6:46:46 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 16, 2021 6:46:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 16, 2021 6:46:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 16, 2021 6:46:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 011ee2c244f6107ab3378a32e161dedfffc714a935def72127efea3bd1ed1273> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-AR7iwkT2EHqzN4oy4WHe3__HFKk13vchJ-_qO9HtEnM.pb
    Aug 16, 2021 6:46:53 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 16, 2021 6:46:53 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-KS-F6sv5oMHnh51ALIUaW_EjVBOwzNQmWOBtHs4zmwA.jar
    Aug 16, 2021 6:46:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9069222020655701572.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-z8Y3B3VWVwzSZFCPshC-FLE76j3cwjushZ1y99KKbU8.jar
    Aug 16, 2021 6:46:54 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 16, 2021 6:46:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 16, 2021 6:46:54 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 16, 2021 6:46:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 16, 2021 6:46:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 16, 2021 6:46:54 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 16, 2021 6:46:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 16, 2021 6:46:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-16_11_46_54-11846946463664151148?project=apache-beam-testing
    Aug 16, 2021 6:46:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-16_11_46_54-11846946463664151148
    Aug 16, 2021 6:46:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-16_11_46_54-11846946463664151148
    Aug 16, 2021 6:46:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-16T18:46:58.465Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 16, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:05.578Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 16, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:06.154Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 16, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:06.197Z: Expanding GroupByKey operations into optimizable parts.
    Aug 16, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:06.223Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 16, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:06.298Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 16, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:06.327Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 16, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:06.358Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 16, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:06.387Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 16, 2021 6:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:06.745Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2021 6:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:06.826Z: Starting 5 workers in us-central1-c...
    Aug 16, 2021 6:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:13.421Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 16, 2021 6:47:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:47:56.093Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 16, 2021 6:48:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:48:20.582Z: Workers have started successfully.
    Aug 16, 2021 6:48:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:48:20.617Z: Workers have started successfully.
    Aug 16, 2021 6:48:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:48:51.809Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2021 6:48:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:48:51.964Z: Cleaning up.
    Aug 16, 2021 6:48:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:48:52.054Z: Stopping worker pool...
    Aug 16, 2021 6:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:51:11.320Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 16, 2021 6:51:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T18:51:11.361Z: Worker pool stopped.
    Aug 16, 2021 6:51:18 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-16_11_46_54-11846946463664151148 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 55db4f69-7934-406b-86b8-e8632faa8318 and timestamp: 2021-08-16T18:51:18.670000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.025

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2021 6:51:19 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 43.665 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 49s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/gt5yw7rcb4v36

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2308

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2308/display/redirect>

Changes:


------------------------------------------
[...truncated 347.68 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
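
This failure is the second root cause listed in the message: the rel node's output PCollection carries Beam Rows, and Row has no generally inferable Coder, so pipeline construction fails in finishSpecifying. The remedy the message points to is attaching the Row's Schema with setRowSchema on the producing transform's output. A minimal, self-contained sketch (the two-field schema and values are illustrative, not the IT's actual HACKER_NEWS schema):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class RowSchemaExample {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        // Illustrative schema; the IT's rows have many more fields.
        Schema schema =
            Schema.builder().addStringField("type").addInt32Field("score").build();
        PCollection<Row> rows =
            p.apply(Create.of("story,3", "job,5"))
                .apply(
                    MapElements.into(TypeDescriptors.rows())
                        .via(
                            line -> {
                              String[] parts = line.split(",");
                              return Row.withSchema(schema)
                                  .addValues(parts[0], Integer.parseInt(parts[1]))
                                  .build();
                            }))
                // Without this call, coder inference fails exactly as in the trace above.
                .setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }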

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2021 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2021 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2021 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2021 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2021 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2021 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2021 12:45:09 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 16, 2021 12:45:09 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
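
For context, the two plans above show the pushdown working: the projection (usedFields) and the whole filter are handed to the BigQuery source, so only matching rows of the four selected columns ever enter the pipeline. The behavior is driven by declaring the table with the DIRECT_READ method. A hedged sketch of such a declaration via Beam SQL DDL plus SqlTransform, given an existing Pipeline named pipeline; the column list and LOCATION are abbreviated assumptions, not the IT's actual definition:

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Assumed, abbreviated table definition; 'project:dataset.table' location is a placeholder.
    String ddl =
        "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, `type` VARCHAR, title VARCHAR, score INTEGER) "
            + "TYPE 'bigquery' "
            + "LOCATION 'apache-beam-testing:beam.HACKER_NEWS' "
            + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";

    PCollection<Row> result =
        pipeline.apply(
            SqlTransform.query(
                    "SELECT `by` AS author, `type`, title, score FROM HACKER_NEWS "
                        + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2")
                .withDdlString(ddl));
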
    Aug 16, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 16, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 16, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash 3cb1b53a323cb256f3cf18a600c70e6a8a179012064a5b7d4e2c8f223cb72af1> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-PLG1OjI8slbzzximAMcOaooXkBIGSlt9TiyPIjy3KvE.pb
    Aug 16, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 16, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 16, 2021 12:45:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3303341892101732943.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Ep0MBR5IZ5Jg_M3UOEGIIBm-Y7HxYZMrWdPXTbS0XNI.jar
    Aug 16, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 1 seconds
    Aug 16, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 16, 2021 12:45:24 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
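
The SEVERE entry above is gRPC's orphaned-channel detector: pipeline validation created a BigQueryWriteClient whose underlying channel was garbage-collected without ever being shut down. It does not fail the run, but the orderly-shutdown pattern the message asks for looks like the following sketch (target and timeout are illustrative; in this code path the equivalent fix is closing the client, since GAPIC clients such as BigQueryWriteClient are AutoCloseable and shut their channels down in close()):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    static void useChannel() throws InterruptedException {
      ManagedChannel channel =
          ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
      try {
        // ... issue RPCs on the channel ...
      } finally {
        channel.shutdown(); // begin graceful shutdown
        if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
          channel.shutdownNow(); // force-close anything still pending
        }
      }
    }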

    Aug 16, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 16, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 16, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 16, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 16, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-16_05_45_25-7132066470934648140?project=apache-beam-testing
    Aug 16, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-16_05_45_25-7132066470934648140
    Aug 16, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-16_05_45_25-7132066470934648140
    Aug 16, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-16T12:45:28.889Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
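
This warning is expected for these perf runs: the job pins a fixed worker pool, so the max-workers setting has no effect. The triggering combination, expressed on the Dataflow runner's pipeline options, looks roughly like this sketch (values mirror this job; option names come from DataflowPipelineWorkerPoolOptions):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    DataflowPipelineOptions options =
        PipelineOptionsFactory.create().as(DataflowPipelineOptions.class);
    options.setNumWorkers(5);     // the fixed pool actually started
    options.setMaxNumWorkers(5);  // ignored -- hence the warning -- because of:
    options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
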
    Aug 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:34.886Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:35.476Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:35.505Z: Expanding GroupByKey operations into optimizable parts.
    Aug 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:35.536Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:35.607Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:35.632Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:35.662Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:35.686Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:35.995Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:36.056Z: Starting 5 workers in us-central1-c...
    Aug 16, 2021 12:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:45:46.987Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 16, 2021 12:46:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:46:19.407Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 16, 2021 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:46:44.645Z: Workers have started successfully.
    Aug 16, 2021 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:46:44.664Z: Workers have started successfully.
    Aug 16, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:47:15.637Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:47:15.774Z: Cleaning up.
    Aug 16, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:47:15.859Z: Stopping worker pool...
    Aug 16, 2021 12:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:49:29.735Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 16, 2021 12:49:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T12:49:29.773Z: Worker pool stopped.
    Aug 16, 2021 12:49:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-16_05_45_25-7132066470934648140 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3697cc41-83a0-43ea-bdee-f6b842783f9d and timestamp: 2021-08-16T12:49:34.990000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    12.069

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2021 12:49:35 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 36.081 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 11s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4vxaioum5j7iq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2307

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2307/display/redirect>

Changes:


------------------------------------------
[...truncated 348.21 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 16, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 16, 2021 6:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 16, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 16, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash a36bdd26b542acc2c29ec57b1acf89767c0624e4b8f165edd346d32a2e2a9dff> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-o2vdJrVCrMLCnsV7Gs-JdnwGJOS48WXt00bTKi4qnf8.pb
    Aug 16, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 16, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 16, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7210476164538751764.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-nLbBFenaoexwqhZtsBgc3w2DNTeAgDKv8wnSpX7TJz0.jar
    Aug 16, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 16, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 16, 2021 6:45:05 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 16, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 16, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 16, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 16, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 16, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-15_23_45_06-11683026011473734275?project=apache-beam-testing
    Aug 16, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-15_23_45_06-11683026011473734275
    Aug 16, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-15_23_45_06-11683026011473734275
    Aug 16, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-16T06:45:09.669Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 16, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:16.055Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:16.567Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:16.615Z: Expanding GroupByKey operations into optimizable parts.
    Aug 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:16.646Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:16.720Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:16.771Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:16.805Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:16.838Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:17.189Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:17.277Z: Starting 5 workers in us-central1-c...
    Aug 16, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:22.622Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 16, 2021 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:50.871Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
    Aug 16, 2021 6:45:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:45:50.908Z: Resized worker pool to 4, though goal was 5.  This could be a quota issue.
    Aug 16, 2021 6:46:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:46:01.128Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 16, 2021 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:46:25.242Z: Workers have started successfully.
    Aug 16, 2021 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:46:25.274Z: Workers have started successfully.
    Aug 16, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:46:54.262Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:46:54.440Z: Cleaning up.
    Aug 16, 2021 6:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:46:54.525Z: Stopping worker pool...
    Aug 16, 2021 6:49:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:49:12.473Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 16, 2021 6:49:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T06:49:12.549Z: Worker pool stopped.
    Aug 16, 2021 6:49:19 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-15_23_45_06-11683026011473734275 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6c4ec82b-d4c8-4cdf-aab5-cb1b46428e23 and timestamp: 2021-08-16T06:49:19.433000000Z:
                     Metric:                    Value:
                   read_time                     8.518
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2021 6:49:19 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 30.036 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 0s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/qtzt2xiqkx5jg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2306

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2306/display/redirect>

Changes:


------------------------------------------
[...truncated 348.18 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 16, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 16, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 16, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 16, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 16, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 16, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 16, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 16, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115908 bytes, hash 189b0d61bdf73457851c5f3b90862b86e6d0f2a551702cf361f43e4ba5ef3b53> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-GJsNYb33NFeFHF87kIYrhubQ8qVRcCzzYfQ-S6XvO1M.pb
    Aug 16, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 16, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 16, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8117525925557128233.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-imE2Nwq-0CzGMJUrOcRjJjM8qhDOsHVqSPlfXTOBbi4.jar
    Aug 16, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 16, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 16, 2021 12:45:05 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 16, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 16, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 16, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 16, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 16, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-15_17_45_05-12150956037802383232?project=apache-beam-testing
    Aug 16, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-15_17_45_05-12150956037802383232
    Aug 16, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-15_17_45_05-12150956037802383232
    Aug 16, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-16T00:45:08.964Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 16, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:14.675Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 16, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:15.493Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 16, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:15.538Z: Expanding GroupByKey operations into optimizable parts.
    Aug 16, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:15.571Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 16, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:15.644Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 16, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:15.678Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 16, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:15.710Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 16, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:15.748Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 16, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:16.139Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:16.220Z: Starting 5 workers in us-central1-c...
    Aug 16, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:24.473Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 16, 2021 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:45.423Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 16, 2021 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:45.449Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 16, 2021 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:45:55.644Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 16, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:46:20.950Z: Workers have started successfully.
    Aug 16, 2021 12:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:46:20.987Z: Workers have started successfully.
    Aug 16, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:46:50.533Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 16, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:46:50.682Z: Cleaning up.
    Aug 16, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:46:50.749Z: Stopping worker pool...
    Aug 16, 2021 12:49:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:49:01.289Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 16, 2021 12:49:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-16T00:49:01.329Z: Worker pool stopped.
    Aug 16, 2021 12:49:07 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-15_17_45_05-12150956037802383232 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 28990cd6-6c4d-4eee-8aba-b7dbd18d2c87 and timestamp: 2021-08-16T00:49:07.356000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.151

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 16, 2021 12:49:07 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 18.356 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 48s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/kacpqz6ghalgm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2305

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2305/display/redirect>

Changes:


------------------------------------------
[...truncated 347.85 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 15, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
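
For reference, the same query can be issued against any schema-aware PCollection through SqlTransform. A minimal sketch, assuming a PCollection<Row> named hackerNewsRows that already carries a schema (the name is illustrative, not taken from the build output); note that the filter push-down logged above only happens when the table comes from the BigQuery table provider, whereas over an in-memory collection the WHERE clause is evaluated in BeamCalcRel instead:

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    // Register the input under the table name the query refers to.
    PCollection<Row> result =
        PCollectionTuple.of(new TupleTag<Row>("HACKER_NEWS"), hackerNewsRows)
            .apply(SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM HACKER_NEWS "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
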
    Aug 15, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 15, 2021 6:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 15, 2021 6:45:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash b8b54518ad1b9378ec2e4581755af542b844b17a47ded6d8e01aef41a5650005> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uLVFGK0bk3jsLkWBdVr1QrhEsXpH3tbY4BrvQaVlAAU.pb
    Aug 15, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 15, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 15, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8430995399001611235.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dazDj47Qejc3GN--g5RVVGIBeqU-_-aW-Is7uDwSWik.jar
    Aug 15, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 1 seconds
    Aug 15, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 15, 2021 6:45:04 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
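
The remedy the warning spells out is an orderly channel shutdown. A minimal sketch of that pattern for any io.grpc.ManagedChannel (in this trace the channel is owned by the generated BigQueryWriteClient, so closing that client, which is AutoCloseable, would be the natural fix at the call site):

    import java.util.concurrent.TimeUnit;

    import io.grpc.ManagedChannel;

    static void shutdownChannel(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();                      // stop accepting new RPCs
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
        channel.shutdownNow();                 // cancel anything still in flight
        channel.awaitTermination(5, TimeUnit.SECONDS);
      }
    }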

    Aug 15, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 15, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 15, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 15, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 15, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-15_11_45_04-14247851697265665497?project=apache-beam-testing
    Aug 15, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-15_11_45_04-14247851697265665497
    Aug 15, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-15_11_45_04-14247851697265665497
    Aug 15, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-15T18:45:08.044Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 15, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:15.612Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 15, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:16.404Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 15, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:16.441Z: Expanding GroupByKey operations into optimizable parts.
    Aug 15, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:16.471Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 15, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:16.533Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 15, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:16.659Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 15, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:16.699Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 15, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:16.759Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 15, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:17.151Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2021 6:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:17.226Z: Starting 5 workers in us-central1-c...
    Aug 15, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:24.366Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 15, 2021 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:46.572Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 15, 2021 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:46.592Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 15, 2021 6:45:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:45:56.820Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 15, 2021 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:46:23.063Z: Workers have started successfully.
    Aug 15, 2021 6:46:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:46:23.097Z: Workers have started successfully.
    Aug 15, 2021 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:46:53.430Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2021 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:46:53.631Z: Cleaning up.
    Aug 15, 2021 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:46:53.706Z: Stopping worker pool...
    Aug 15, 2021 6:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:49:16.032Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 15, 2021 6:49:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T18:49:16.067Z: Worker pool stopped.
    Aug 15, 2021 6:49:21 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-15_11_45_04-14247851697265665497 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b9229814-f9c3-47a9-a9e8-3c1261b077c5 and timestamp: 2021-08-15T18:49:21.997000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                      9.31

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2021 6:49:22 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 35.185 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 4s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/kmnddy4rwwoue

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2304

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2304/display/redirect>

Changes:


------------------------------------------
[...truncated 348.77 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
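
The failure message names its own fix: attach a schema with PCollection.setRowSchema so that a RowCoder can be inferred. A minimal sketch, assuming a PCollection<Row> named rows; the field names follow the usedFields in the plans above, and the field types are assumptions rather than anything taken from the build output:

    import org.apache.beam.sdk.schemas.Schema;

    Schema schema =
        Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")   // assumed INT64; adjust to the real column type
            .build();

    // Attaching the schema lets Beam infer RowCoder.of(schema) for the
    // PCollection, which resolves the "Unable to return a default Coder" error.
    rows.setRowSchema(schema);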

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 15, 2021 12:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 15, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 15, 2021 12:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 15, 2021 12:45:00 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash ed070392007255f2fc2ed529804f5991914fbe7c36682b53d7ea29e37cfd7da4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7QcDkgByVfL8LtUpgE9ZkZFPvnw2aCtT1-op43z9faQ.pb
    Aug 15, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 15, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 15, 2021 12:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1547073746703909213.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-g_uPD5oIxxx2G6CMK6kFsQnH9bmgXdabgDjR8m02uqQ.jar
    Aug 15, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 15, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 15, 2021 12:45:03 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 15, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 15, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 15, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 15, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 15, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-15_05_45_03-13925535169004776617?project=apache-beam-testing
    Aug 15, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-15_05_45_03-13925535169004776617
    Aug 15, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-15_05_45_03-13925535169004776617
    Aug 15, 2021 12:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-15T12:45:07.226Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 15, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:13.114Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 15, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:13.785Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 15, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:13.822Z: Expanding GroupByKey operations into optimizable parts.
    Aug 15, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:13.848Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 15, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:13.925Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 15, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:13.970Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 15, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:13.994Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 15, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:14.026Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 15, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:14.373Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:14.444Z: Starting 5 workers in us-central1-c...
    Aug 15, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:30.073Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 15, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:45:58.181Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 15, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:46:24.732Z: Workers have started successfully.
    Aug 15, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:46:24.755Z: Workers have started successfully.
    Aug 15, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:46:54.583Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:46:54.729Z: Cleaning up.
    Aug 15, 2021 12:46:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:46:54.820Z: Stopping worker pool...
    Aug 15, 2021 12:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:49:21.986Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 15, 2021 12:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T12:49:22.031Z: Worker pool stopped.
    Aug 15, 2021 12:49:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-15_05_45_03-13925535169004776617 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 63d67c11-87a2-4821-80e3-f81cb2ae64f1 and timestamp: 2021-08-15T12:49:27.755000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.051

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2021 12:49:28 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
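
The missing measurement/database values are pipeline options for the test. A
sketch of supplying them (the option names here are an assumption based on
Beam's InfluxDB test utilities, not taken from this log):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
      -DintegrationTestPipelineOptions='["--influxDatabase=beam_test_metrics","--influxMeasurement=sql_bqio_read_java_batch"]'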

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 40.23 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
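
To surface the individual deprecation warnings, the failing task can be rerun
with the flag the message names:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --warning-mode all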

BUILD FAILED in 5m 9s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/dso3lmb2t5v5q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2303

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2303/display/redirect>

Changes:


------------------------------------------
[...truncated 347.05 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
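
The root cause named in the message is Coder inference on a PCollection<Row>:
without a schema, Beam cannot infer a RowCoder. A minimal sketch of the fix the
message suggests (field names are illustrative, not the actual HACKER_NEWS
schema):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Attach a schema so a RowCoder can be inferred for the Row elements.
    Schema schema =
        Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    // rows is the PCollection<Row> the error points at (the RowMonitor output).
    rows.setRowSchema(schema);  // equivalent to rows.setCoder(RowCoder.of(schema))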

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2021 6:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 15, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
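
The BEAMPlan above shows both the projection (usedFields) and the filter being
pushed into the BigQuery Storage read. For reference, a sketch of the kind of
Beam SQL application that yields such a plan (query text from the log;
registration of the `beam`.`HACKER_NEWS` table is omitted):

    // Sketch only: apply the logged query via Beam SQL.
    PCollection<Row> result =
        pipeline.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM HACKER_NEWS "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
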
    Aug 15, 2021 6:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 15, 2021 6:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 15, 2021 6:44:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115908 bytes, hash 0a7ca97e8f4a221ca90b0a1ee160e9cf09c410cd4230ff2011b380b555d337ab> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Cnypfo9KIhypCwoe4WDpzwnEEM1CMP8gEbOAtVXTN6s.pb
    Aug 15, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 15, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 15, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5652053329230844324.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-WFLxBYPsQCwkMvuFx3_yFAwl3a8TDBt6sc0pfEmNDws.jar
    Aug 15, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 15, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 15, 2021 6:45:02 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
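
The SEVERE entry above is gRPC's orphaned-channel detector: a ManagedChannel
was garbage-collected without being shut down. In this trace the channel
belongs to a BigQueryWriteClient created inside BigQueryServicesImpl, so the
remedy is closing that client; the general contract the warning refers to looks
roughly like this (target taken from the log, timeout arbitrary):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    ManagedChannel channel =
        ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
    try {
      // ... issue RPCs on the channel ...
    } finally {
      channel.shutdown();                                     // stop accepting new calls
      if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {  // awaitTermination throws InterruptedException
        channel.shutdownNow();                                // force-terminate anything still in flight
      }
    }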

    Aug 15, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 15, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 15, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 15, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 15, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-14_23_45_03-3468513555950336720?project=apache-beam-testing
    Aug 15, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-14_23_45_03-3468513555950336720
    Aug 15, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-14_23_45_03-3468513555950336720
    Aug 15, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-15T06:45:07.537Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 15, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:13.096Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 15, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:13.885Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 15, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:13.928Z: Expanding GroupByKey operations into optimizable parts.
    Aug 15, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:13.968Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 15, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:14.046Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 15, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:14.073Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 15, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:14.106Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 15, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:14.140Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 15, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:14.501Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:14.569Z: Starting 5 workers in us-central1-c...
    Aug 15, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:24.372Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 15, 2021 6:45:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:45:58.910Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 15, 2021 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:46:25.462Z: Workers have started successfully.
    Aug 15, 2021 6:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:46:25.481Z: Workers have started successfully.
    Aug 15, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:46:56.425Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:46:56.552Z: Cleaning up.
    Aug 15, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:46:56.626Z: Stopping worker pool...
    Aug 15, 2021 6:49:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:49:15.415Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 15, 2021 6:49:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T06:49:15.452Z: Worker pool stopped.
    Aug 15, 2021 6:49:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-14_23_45_03-3468513555950336720 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8e85c3e9-f0e6-4b3e-90b0-15a0eb111e3a and timestamp: 2021-08-15T06:49:20.638000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    11.031

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2021 6:49:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 34.289 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 3s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/sh5zb6pbiv45u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2302

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2302/display/redirect>

Changes:


------------------------------------------
[...truncated 348.59 KB...]
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 15, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 15, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 15, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 15, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 15, 2021 12:45:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 15, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 15, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 15, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash bc2bb5e1ab10a085b4590ff8c7efc7cfa0ef0bc6f307b4968265ab4d5236bd0d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-vCu14asQoIW0WQ_4x-_Hz6DvC8bzB7SWgmWrTVI2vQ0.pb
    Aug 15, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 15, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 15, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8033054941235204757.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-dxop04JhJwKwR6OA74gRjF9kDRiqJuJfQLnhHYXEhww.jar
    Aug 15, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.thrift/libthrift/0.14.1/85348a0c44c298bbec5ae747e67ae12e60b3aef6/libthrift-0.14.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/libthrift-0.14.1-WzUQ_nLm8HJeKc7269seq6zMxp15_E7VC2gWAKh2Z-w.jar
    Aug 15, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.tomcat/tomcat-annotations-api/8.5.46/56c67699de192c603afd6f029e80e5ff8d98e7e9/tomcat-annotations-api-8.5.46.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tomcat-annotations-api-8.5.46-amtG0OaVhkRRTAyjZYs7B-YSOmgqIO4203lSQnNfq8M.jar
    Aug 15, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.tomcat.embed/tomcat-embed-core/8.5.46/5d686394334d143f48251827435ab086a161e75e/tomcat-embed-core-8.5.46.jar to gs://temp-storage-for-perf-tests/loadtests/staging/tomcat-embed-core-8.5.46-vl-FREjS7l1uADb-srT3ExYweaG2uaepdQjlWRetNcI.jar
    Aug 15, 2021 12:45:08 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 244 files cached, 4 files newly uploaded in 0 seconds
    Aug 15, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 15, 2021 12:45:08 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 15, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 15, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 15, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 15, 2021 12:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 15, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-14_17_45_08-16367359763360785266?project=apache-beam-testing
    Aug 15, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-14_17_45_08-16367359763360785266
    Aug 15, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-14_17_45_08-16367359763360785266
    Aug 15, 2021 12:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-15T00:45:11.941Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 15, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:18.069Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 15, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:18.789Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 15, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:18.824Z: Expanding GroupByKey operations into optimizable parts.
    Aug 15, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:18.863Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 15, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:18.974Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 15, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:19.016Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 15, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:19.042Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 15, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:19.073Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 15, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:19.410Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2021 12:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:19.477Z: Starting 5 workers in us-central1-c...
    Aug 15, 2021 12:46:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:45:59.520Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 15, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:46:06.280Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 15, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:46:25.487Z: Workers have started successfully.
    Aug 15, 2021 12:46:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:46:25.527Z: Workers have started successfully.
    Aug 15, 2021 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:46:56.629Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 15, 2021 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:46:56.791Z: Cleaning up.
    Aug 15, 2021 12:46:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:46:56.860Z: Stopping worker pool...
    Aug 15, 2021 12:49:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:49:11.212Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 15, 2021 12:49:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-15T00:49:11.264Z: Worker pool stopped.
    Aug 15, 2021 12:49:19 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-14_17_45_08-16367359763360785266 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ca4f9274-a726-4405-8536-ce3862978678 and timestamp: 2021-08-15T00:49:19.452000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.629

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 15, 2021 12:49:20 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 29.242 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 0s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4kvry2ccdmm4o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2301

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2301/display/redirect>

Changes:


------------------------------------------
[...truncated 347.36 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 14, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 14, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 14, 2021 6:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 14, 2021 6:44:59 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash fd1ddecd0835de2b8638856b75be560e83e7f91a07152e034e63beb8fb85ba77> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-_R3ezQg13iuGOIVrdb5WDoPn-RoHFS4DTmO-uPuFunc.pb
    Aug 14, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 14, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2714091922704269393.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-E2vnspjei9EYvke9ttLkdsXXdRV6Q8A0O8OnKO3P_QM.jar
    Aug 14, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 14, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 14, 2021 6:45:02 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 14, 2021 6:45:02 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
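
    The SEVERE message above reports a leaked gRPC channel created inside the
    BigQuery client during pipeline validation; the test never sees the channel
    itself, so the cleanup belongs in the SDK's client lifecycle. For code that
    does own a channel, a minimal sketch of the shutdown sequence the warning
    asks for (endpoint and timeout are illustrative):

        import java.util.concurrent.TimeUnit;

        import io.grpc.ManagedChannel;
        import io.grpc.ManagedChannelBuilder;

        public class ChannelShutdownSketch {
          public static void main(String[] args) throws InterruptedException {
            ManagedChannel channel =
                ManagedChannelBuilder.forAddress("bigquerystorage.googleapis.com", 443).build();
            try {
              // ... issue RPCs on the channel ...
            } finally {
              channel.shutdown(); // start graceful shutdown
              if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
                channel.shutdownNow(); // force-close pending calls on timeout
                channel.awaitTermination(5, TimeUnit.SECONDS);
              }
            }
          }
        }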

    Aug 14, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 14, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 14, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 14, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 14, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-14_11_45_03-11437815500845541599?project=apache-beam-testing
    Aug 14, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-14_11_45_03-11437815500845541599
    Aug 14, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-14_11_45_03-11437815500845541599
    Aug 14, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-14T18:45:06.606Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 14, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:12.738Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 14, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:13.338Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 14, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:13.377Z: Expanding GroupByKey operations into optimizable parts.
    Aug 14, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:13.401Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 14, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:13.456Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 14, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:13.489Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 14, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:13.521Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 14, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:13.553Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 14, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:13.876Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:13.965Z: Starting 5 workers in us-central1-c...
    Aug 14, 2021 6:45:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:22.507Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 14, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:45:57.034Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 14, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:46:22.899Z: Workers have started successfully.
    Aug 14, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:46:22.936Z: Workers have started successfully.
    Aug 14, 2021 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:46:50.803Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2021 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:46:50.912Z: Cleaning up.
    Aug 14, 2021 6:46:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:46:50.964Z: Stopping worker pool...
    Aug 14, 2021 6:49:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:49:13.301Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 14, 2021 6:49:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T18:49:13.344Z: Worker pool stopped.
    Aug 14, 2021 6:49:18 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-14_11_45_03-11437815500845541599 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 0dfab2eb-7a84-4788-8954-301c2952e09a and timestamp: 2021-08-14T18:49:18.765000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.087

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2021 6:49:19 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
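
    The InfluxDB warning above means the run itself completed but its metrics
    were dropped: the publisher refuses to publish without a measurement and
    database. A sketch of the settings object it consumes, assuming the builder
    and option names from Beam's test utilities (unverified against this exact
    SDK version, so treat every name as illustrative):

        import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

        public class InfluxSettingsSketch {
          public static void main(String[] args) {
            // Hypothetical values; the perf-test jobs normally supply these via
            // pipeline options such as --influxDatabase and --influxMeasurement.
            InfluxDBSettings settings =
                InfluxDBSettings.builder()
                    .withHost("http://localhost:8086")
                    .withDatabase("beam_test_metrics")
                    .withMeasurement("sql_bqio_read_java_batch")
                    .get();
          }
        }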

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 32.354 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 1s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/cthfdviiibub6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2300

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2300/display/redirect>

Changes:


------------------------------------------
[...truncated 348.39 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 14, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 14, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 14, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 14, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115846 bytes, hash b9290a45b973b63bb27660320462fc1d1cc4e9430c41da491abe8e8f2be2e29d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uSkKRblztjuydmAyBGL8HRzE6UMMQdpJGr6Ojyvi4p0.pb
    Aug 14, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 14, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 14, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5775970159894225398.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-wk5yf6CU8JHW3J86Hgiidc2wDtC4BHwnt9eIZ2rJDq0.jar
    Aug 14, 2021 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 14, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 14, 2021 12:45:12 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 14, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 14, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 14, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 14, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 14, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-14_05_45_12-9725367090148752909?project=apache-beam-testing
    Aug 14, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-14_05_45_12-9725367090148752909
    Aug 14, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-14_05_45_12-9725367090148752909
    Aug 14, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-14T12:45:16.175Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 14, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:23.331Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:24.172Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:24.212Z: Expanding GroupByKey operations into optimizable parts.
    Aug 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:24.252Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:24.347Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:24.373Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:24.411Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:24.447Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:24.783Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:24.858Z: Starting 5 workers in us-central1-c...
    Aug 14, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:44.983Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 14, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:56.977Z: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
    Aug 14, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:45:57.008Z: Resized worker pool to 3, though goal was 5.  This could be a quota issue.
    Aug 14, 2021 12:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:46:07.236Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 14, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:46:30.638Z: Workers have started successfully.
    Aug 14, 2021 12:46:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:46:30.675Z: Workers have started successfully.
    Aug 14, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:47:02.126Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:47:02.279Z: Cleaning up.
    Aug 14, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:47:02.362Z: Stopping worker pool...
    Aug 14, 2021 12:49:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:49:19.013Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 14, 2021 12:49:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T12:49:19.234Z: Worker pool stopped.
    Aug 14, 2021 12:49:26 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-14_05_45_12-9725367090148752909 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1fc6d483-b7a5-49f3-a1e0-1b7e48efbee1 and timestamp: 2021-08-14T12:49:26.588000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.993

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2021 12:49:27 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 33.344 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/ieo7q2bega2ji

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2299

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2299/display/redirect>

Changes:


------------------------------------------
[...truncated 348.61 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
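
    The readUsingDefaultMethod failure above is a coder-inference problem, and
    the exception message names the fix: attach the row schema to the
    PCollection. A minimal, self-contained sketch of that call (the schema and
    values here are illustrative, not the IT's actual code):

        import org.apache.beam.sdk.Pipeline;
        import org.apache.beam.sdk.options.PipelineOptionsFactory;
        import org.apache.beam.sdk.schemas.Schema;
        import org.apache.beam.sdk.transforms.Create;
        import org.apache.beam.sdk.transforms.DoFn;
        import org.apache.beam.sdk.transforms.ParDo;
        import org.apache.beam.sdk.values.PCollection;
        import org.apache.beam.sdk.values.Row;

        public class RowSchemaSketch {
          public static void main(String[] args) {
            Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
            Schema schema =
                Schema.builder().addStringField("author").addInt32Field("score").build();
            PCollection<Row> rows =
                p.apply(Create.of("someone:3"))
                    .apply(
                        ParDo.of(
                            new DoFn<String, Row>() {
                              @ProcessElement
                              public void process(@Element String line, OutputReceiver<Row> out) {
                                String[] parts = line.split(":");
                                out.output(
                                    Row.withSchema(schema)
                                        .addValues(parts[0], Integer.parseInt(parts[1]))
                                        .build());
                              }
                            }))
                    // The call the error message asks for; without it no Coder can
                    // be inferred for Row and pipeline construction fails as above.
                    .setRowSchema(schema);
            p.run().waitUntilFinish();
          }
        }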

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 14, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 14, 2021 6:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 14, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 14, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 2ed7d4f53901b762510581a1be512431f15ca286c9a0c32716cb3f01796b9861> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-LtfU9TkBt2JRBYGhvlEkMfFcoobJoMMnFss_AXlrmGE.pb
    Aug 14, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 14, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 14, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1760296602170169560.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-cn37HLVFM1hpxQXugr7fYNP8GQTCsN99RkWJlAA1Bk8.jar
    Aug 14, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 14, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 14, 2021 6:45:03 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 14, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 14, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 14, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 14, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 14, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-13_23_45_04-11186255344392086691?project=apache-beam-testing
    Aug 14, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-13_23_45_04-11186255344392086691
    Aug 14, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-13_23_45_04-11186255344392086691
    Aug 14, 2021 6:45:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-14T06:45:07.818Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 14, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:14.853Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 14, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:15.473Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 14, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:15.520Z: Expanding GroupByKey operations into optimizable parts.
    Aug 14, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:15.556Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 14, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:15.655Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 14, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:15.690Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 14, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:15.732Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 14, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:15.764Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 14, 2021 6:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:16.124Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:16.214Z: Starting 5 workers in us-central1-c...
    Aug 14, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:45:21.679Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 14, 2021 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:46:00.127Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 14, 2021 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:46:25.630Z: Workers have started successfully.
    Aug 14, 2021 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:46:25.664Z: Workers have started successfully.
    Aug 14, 2021 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:46:56.636Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2021 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:46:56.802Z: Cleaning up.
    Aug 14, 2021 6:46:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:46:56.885Z: Stopping worker pool...
    Aug 14, 2021 6:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:49:14.604Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 14, 2021 6:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T06:49:14.656Z: Worker pool stopped.
    Aug 14, 2021 6:49:20 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-13_23_45_04-11186255344392086691 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 18fe7664-f120-42b2-a7b6-ebdfe1699a13 and timestamp: 2021-08-14T06:49:20.938000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.486

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2021 6:49:21 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
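
This warning means the run finished but its metrics were dropped: InfluxDBPublisher only publishes when both a database and a measurement are configured. A hedged sketch of the settings object such publishing consumes, assuming the InfluxDBSettings builder from the same org.apache.beam.sdk.testutils.publishing package (host, database, and measurement values below are placeholders):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // Both database and measurement must be set, or metrics are skipped.
    InfluxDBSettings settings = InfluxDBSettings.builder()
        .withHost("http://localhost:8086")       // placeholder host
        .withDatabase("beam_test_metrics")       // placeholder database
        .withMeasurement("sql_bqio_read_java")   // placeholder measurement
        .get();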

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 32.918 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/6zqygopos64pu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2298

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2298/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12757]: Remove call for Json Dag in Portable PipelineRunner as

[noreply] fix confusing function def (#15120)


------------------------------------------
[...truncated 347.13 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
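
This failure is the usual coder-inference gap for generic Row outputs: a ParDo that emits Row gives the SDK no concrete element type to infer a coder from, so the schema must be attached explicitly, exactly as the error text suggests. A minimal sketch of the fix, assuming a PCollection<Row> named rows and a Schema named schema (hypothetical names for illustration):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Attaching the schema lets Beam build a RowCoder for this collection;
    // setRowSchema(schema) is shorthand for setCoder(RowCoder.of(schema)).
    PCollection<Row> withSchema = rows.setRowSchema(schema);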

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 14, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 14, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 14, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 14, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 14, 2021 12:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 14, 2021 12:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 14, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 14, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash d566ed99d7c7b4657f201246e8afc57137126f52a091f6a7a115384c6582afec> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1WbtmdfHtGV_IBJG6K_FcTcSb1KgkfanoRU4TGWCr-w.pb
    Aug 14, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 14, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 14, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6149197251698038484.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-R5Pd5nG7wWTp8cXfy_rblMKyh6cTGfshtNLOs1KpSyU.jar
    Aug 14, 2021 12:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 14, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 14, 2021 12:45:06 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
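
The SEVERE message is gRPC's orphaned-channel detector: a BigQuery write client created during pipeline validation was garbage-collected without being closed, so its ManagedChannel was never shut down. The remedy the log asks for is the standard shutdown discipline, sketched below; the channel parameter is hypothetical, and in this trace the channel is actually owned by BigQueryWriteClient, which would be released via its own close():

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;

    static void closeQuietly(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();                                   // stop accepting new calls
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
        channel.shutdownNow();                              // cancel in-flight RPCs
        channel.awaitTermination(5, TimeUnit.SECONDS);      // final wait before giving up
      }
    }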

    Aug 14, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 14, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 14, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 14, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 14, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-13_17_45_06-11151959740404657107?project=apache-beam-testing
    Aug 14, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-13_17_45_06-11151959740404657107
    Aug 14, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-13_17_45_06-11151959740404657107
    Aug 14, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-14T00:45:10.062Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 14, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:17.745Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 14, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:18.369Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 14, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:18.418Z: Expanding GroupByKey operations into optimizable parts.
    Aug 14, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:18.471Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 14, 2021 12:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:18.556Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 14, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:18.593Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 14, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:18.633Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 14, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:18.675Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 14, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:19.259Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:19.346Z: Starting 5 workers in us-central1-c...
    Aug 14, 2021 12:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:32.443Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 14, 2021 12:45:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:45:55.853Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 14, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:46:21.564Z: Workers have started successfully.
    Aug 14, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:46:21.594Z: Workers have started successfully.
    Aug 14, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:46:51.225Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 14, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:46:51.405Z: Cleaning up.
    Aug 14, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:46:51.525Z: Stopping worker pool...
    Aug 14, 2021 12:49:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:49:09.774Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 14, 2021 12:49:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-14T00:49:09.824Z: Worker pool stopped.
    Aug 14, 2021 12:49:17 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-13_17_45_06-11151959740404657107 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): bc1f24a5-8b3e-4209-965f-796b732694dd and timestamp: 2021-08-14T00:49:17.178000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.197

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 14, 2021 12:49:17 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 27.545 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 58s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/qqzut563vzjrq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2297

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2297/display/redirect>

Changes:


------------------------------------------
[...truncated 346.60 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 13, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 13, 2021 6:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 13, 2021 6:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 13, 2021 6:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash b73cad440641839e3360b7c7406380d962f5070a07c785a6edb1c9ccbe861b67> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tzytRAZBg54zYLfHQGOA2WL1BwoHx4Wm7bHJzL6GG2c.pb
    Aug 13, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 13, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 13, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2521008672497125922.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2wrGIZncAhdekA7tdr9LHGpD4-rC5fU0PE4yGUR9nio.jar
    Aug 13, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 13, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 13, 2021 6:45:04 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 13, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 13, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 13, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 13, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 13, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-13_11_45_05-1599838006857705396?project=apache-beam-testing
    Aug 13, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-13_11_45_05-1599838006857705396
    Aug 13, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-13_11_45_05-1599838006857705396
    Aug 13, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-13T18:45:08.499Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 13, 2021 6:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:16.549Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 13, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:17.227Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 13, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:17.265Z: Expanding GroupByKey operations into optimizable parts.
    Aug 13, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:17.317Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 13, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:17.387Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 13, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:17.437Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 13, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:17.480Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 13, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:17.526Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 13, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:17.867Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2021 6:45:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:17.976Z: Starting 5 workers in us-central1-c...
    Aug 13, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:45:30.739Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 13, 2021 6:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:46:02.554Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 13, 2021 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:46:29.576Z: Workers have started successfully.
    Aug 13, 2021 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:46:29.613Z: Workers have started successfully.
    Aug 13, 2021 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:47:07.226Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2021 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:47:07.377Z: Cleaning up.
    Aug 13, 2021 6:47:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:47:07.470Z: Stopping worker pool...
    Aug 13, 2021 6:49:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:49:26.391Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 13, 2021 6:49:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T18:49:26.437Z: Worker pool stopped.
    Aug 13, 2021 6:49:32 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-13_11_45_05-1599838006857705396 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e2223f5b-f67c-4540-a911-23688c5361ec and timestamp: 2021-08-13T18:49:32.405000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     13.44

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2021 6:49:32 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 44.493 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 14s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/cw45c2jwjwp5u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2296

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2296/display/redirect>

Changes:


------------------------------------------
[...truncated 350.14 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1286105505]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2021 12:47:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2021 12:47:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2021 12:47:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2021 12:47:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2021 12:47:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2021 12:47:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2021 12:47:05 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 13, 2021 12:47:05 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
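
The two plans above show the point of this test: the LogicalProject/LogicalFilter from the SQL are folded into a BeamPushDownIOSourceRel, so only the four used fields and the supported predicate reach BigQuery. For context, a minimal sketch of issuing the same shape of query through Beam SQL; it runs over an in-memory PCOLLECTION rather than the IT's BigQuery table provider, so it illustrates the query, not the push-down itself:

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class PushDownQuerySketch {
      // Same projection and filter as the log's query; with a table provider
      // such as BigQuery's, the planner can push both down to the source.
      static PCollection<Row> query(PCollection<Row> hackerNews) {
        return hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
      }
    }
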
    Aug 13, 2021 12:47:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 13, 2021 12:47:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 13, 2021 12:47:14 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115851 bytes, hash e7f74e89f7f5d6f733b6120d70eb80da57ac45f3a4bcbc771b917b7e37dbdce9> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-5_dOiff11vczthINcOuA2lesRfOkvLx3G5F7fjfb3Ok.pb
    Aug 13, 2021 12:47:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 13, 2021 12:47:17 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 13, 2021 12:47:17 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4528212258275925498.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-9wldAPOls8lRRGodLyUDbZ134HUk1wZZTdVZeE4TkcQ.jar
    Aug 13, 2021 12:47:17 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 13, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 13, 2021 12:47:18 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
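
The SEVERE entry above is gRPC's orphaned-channel detector: a ManagedChannel created inside BigQueryServicesImpl (see the allocation trace) was garbage-collected without being shut down. It is a resource-cleanup warning in library code, not the cause of the test failure. The cleanup the warning text asks for looks like this minimal sketch:

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    class ChannelShutdownSketch {
      // Initiate shutdown, wait for termination, and force-close if the
      // deadline passes -- the pattern the warning text recommends.
      static void close(ManagedChannel channel) throws InterruptedException {
        channel.shutdown();
        if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
          channel.shutdownNow();
          channel.awaitTermination(5, TimeUnit.SECONDS);
        }
      }
    }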

    Aug 13, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 13, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 13, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 13, 2021 12:47:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 13, 2021 12:47:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-13_05_47_18-4828635293535640960?project=apache-beam-testing
    Aug 13, 2021 12:47:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-13_05_47_18-4828635293535640960
    Aug 13, 2021 12:47:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-13_05_47_18-4828635293535640960
    Aug 13, 2021 12:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-13T12:47:23.980Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 13, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:39.377Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 13, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:40.156Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 13, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:40.202Z: Expanding GroupByKey operations into optimizable parts.
    Aug 13, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:40.245Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 13, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:40.429Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 13, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:40.547Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 13, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:40.642Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 13, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:40.712Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 13, 2021 12:47:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:41.238Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2021 12:47:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:41.313Z: Starting 5 workers in us-central1-c...
    Aug 13, 2021 12:47:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:47:50.875Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 13, 2021 12:48:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:48:25.442Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 13, 2021 12:48:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:48:52.073Z: Workers have started successfully.
    Aug 13, 2021 12:48:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:48:52.105Z: Workers have started successfully.
    Aug 13, 2021 12:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:49:22.012Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2021 12:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:49:22.262Z: Cleaning up.
    Aug 13, 2021 12:49:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:49:22.374Z: Stopping worker pool...
    Aug 13, 2021 12:51:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:51:54.887Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 13, 2021 12:51:55 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T12:51:54.945Z: Worker pool stopped.
    Aug 13, 2021 12:53:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-13_05_47_18-4828635293535640960 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 62ee6029-d679-4985-8609-b9a02412cfee and timestamp: 2021-08-13T12:53:15.371000000Z:
                     Metric:                    Value:
                   read_time                     8.767
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2021 12:53:15 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
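
This warning means the run never configured an InfluxDB measurement/database, so the read_time and fields_read values above stay in the console only. A sketch of supplying them, assuming the builder in Beam's test-utils InfluxDBSettings (the method names and all values here are assumptions and may differ by version and deployment):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    class InfluxSettingsSketch {
      // Assumed InfluxDBSettings builder API; hypothetical host, database,
      // and measurement names for illustration only.
      static InfluxDBSettings settings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();
      }
    }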

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 6 mins 25.012 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 38s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/427bley3a5zio

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2295

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2295/display/redirect>

Changes:


------------------------------------------
[...truncated 347.87 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 13, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 13, 2021 6:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 13, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 13, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 28609eacb6284ea23bde7bdfc05aa151f833f9a189fcf0a83f96702a118f11f0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KGCerLYoTqI73nvfwFqhUfgz-aGJ_PCoP5ZwKhGPEfA.pb
    Aug 13, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 13, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4811381956572785938.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-WauvKwAPyWhPPMPhmsnh5RytPFTmozu8wBi3QQROqhA.jar
    Aug 13, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 13, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.33.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.33.0-SNAPSHOT-tests-ir07asFiuWsB6EYUMDtP2vRUrUgxBz8EzmvBKSogYm0.jar
    Aug 13, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.33.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.33.0-SNAPSHOT-xla6F42QKxVB1aLg_hKfFC9DL-XPmimO2kwgZF8GV0E.jar
    Aug 13, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 245 files cached, 3 files newly uploaded in 0 seconds
    Aug 13, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 13, 2021 6:45:06 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 13, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 13, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 13, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 13, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 13, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-12_23_45_06-11019022182449848254?project=apache-beam-testing
    Aug 13, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-12_23_45_06-11019022182449848254
    Aug 13, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-12_23_45_06-11019022182449848254
    Aug 13, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-13T06:45:10.032Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 13, 2021 6:45:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:24.047Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:24.808Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:24.845Z: Expanding GroupByKey operations into optimizable parts.
    Aug 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:24.872Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:24.944Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:24.982Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:25.021Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:25.054Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:25.384Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:25.471Z: Starting 5 workers in us-central1-c...
    Aug 13, 2021 6:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:45:57.607Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 13, 2021 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:46:10.833Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 13, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:46:35.624Z: Workers have started successfully.
    Aug 13, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:46:35.650Z: Workers have started successfully.
    Aug 13, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:47:05.037Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:47:05.177Z: Cleaning up.
    Aug 13, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:47:05.266Z: Stopping worker pool...
    Aug 13, 2021 6:49:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:49:21.695Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 13, 2021 6:49:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T06:49:21.751Z: Worker pool stopped.
    Aug 13, 2021 6:49:29 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-12_23_45_06-11019022182449848254 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 27320864-6fb3-45e8-a4dd-574530a78ad4 and timestamp: 2021-08-13T06:49:29.110000000Z:
                     Metric:                    Value:
                   read_time                     8.811
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2021 6:49:29 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 39.449 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 9s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/c7ldwpk5xwu6q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2294

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2294/display/redirect?page=changes>

Changes:

[noreply] Loosen typing extensions requirement

[Kyle Weaver] [BEAM-12719] ZetaSQL docs: replace out of date table with link to

[noreply] [BEAM-12453]: Add interface to access I/O topic information for a Samza

[noreply] [BEAM-10212] Guard state caching with experiment (#15319)


------------------------------------------
[...truncated 360.00 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2021 1:10:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2021 1:10:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2021 1:10:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 13, 2021 1:10:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 13, 2021 1:10:13 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 13, 2021 1:10:13 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 13, 2021 1:10:14 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 13, 2021 1:10:14 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 13, 2021 1:10:15 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 13, 2021 1:10:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 13, 2021 1:10:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115846 bytes, hash f440dfae5d7f1504206ee1d157f2151f3cf9de8d72aa09ff1ad9d2b02ad86377> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9EDfrl1_FQQgbuHRV_IVHzz53o1yqgn_GtnSsCrYY3c.pb
    Aug 13, 2021 1:10:36 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 13, 2021 1:10:36 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 13, 2021 1:10:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-yiqRA0u3xlC3Kk9E1r7d4NTDO5HL5Z0dcibMgeYMmBU.jar
    Aug 13, 2021 1:10:39 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7373398332588075530.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-qQBuIWMfpsA1R27hl4gE2AEvHyl_I1oV_nLtn6472QM.jar
    Aug 13, 2021 1:10:43 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 2 files newly uploaded in 6 seconds
    Aug 13, 2021 1:10:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 13, 2021 1:10:43 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 13, 2021 1:10:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 13, 2021 1:10:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 13, 2021 1:10:44 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 13, 2021 1:10:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 13, 2021 1:10:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-12_18_10_44-2024745818295088582?project=apache-beam-testing
    Aug 13, 2021 1:10:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-12_18_10_44-2024745818295088582
    Aug 13, 2021 1:10:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-12_18_10_44-2024745818295088582
    Aug 13, 2021 1:10:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-13T01:10:49.966Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 13, 2021 1:10:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:10:56.647Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 13, 2021 1:10:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:10:57.346Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 13, 2021 1:10:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:10:57.385Z: Expanding GroupByKey operations into optimizable parts.
    Aug 13, 2021 1:10:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:10:57.408Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 13, 2021 1:10:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:10:57.479Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 13, 2021 1:10:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:10:57.508Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 13, 2021 1:10:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:10:57.536Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 13, 2021 1:10:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:10:57.559Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 13, 2021 1:10:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:10:57.903Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2021 1:10:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:10:57.980Z: Starting 5 workers in us-central1-c...
    Aug 13, 2021 1:11:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:11:09.820Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 13, 2021 1:11:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:11:40.751Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 13, 2021 1:12:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:12:08.267Z: Workers have started successfully.
    Aug 13, 2021 1:12:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:12:08.300Z: Workers have started successfully.
    Aug 13, 2021 1:12:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:12:39.396Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 13, 2021 1:12:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:12:39.601Z: Cleaning up.
    Aug 13, 2021 1:12:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:12:39.725Z: Stopping worker pool...
    Aug 13, 2021 1:14:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:14:58.047Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 13, 2021 1:14:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-13T01:14:58.098Z: Worker pool stopped.
    Aug 13, 2021 1:15:04 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-12_18_10_44-2024745818295088582 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): efee940a-78d7-4c66-aa0f-7430d1fe2d43 and timestamp: 2021-08-13T01:15:04.623000000Z:
                     Metric:                    Value:
                   read_time                     10.57
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 13, 2021 1:15:05 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
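
This warning is why no numbers from the run above reached the metrics dashboard: the InfluxDB publisher refuses to write unless both a measurement and a database are configured. In Beam's perf-test jobs those values are normally handed to the test task as pipeline options; the shape below follows that convention, but the option names are an assumption here (they are not printed in this log) and the values are placeholders:

    -DintegrationTestPipelineOptions='["--influxDatabase=<database>", "--influxMeasurement=<measurement>", "--influxHost=<host>"]'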

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.114 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.113 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 5 mins 34.625 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 16s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/x25zdqjm5j7qw

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2293

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2293/display/redirect?page=changes>

Changes:

[bsindhwani] Python Kata - Event Time Triggers

[bsindhwani] Python Kata - Event Time Triggers - Added license to task-info.yaml

[bsindhwani] Python Kata - Event Time Triggers - refactored the accumulation mode

[bsindhwani] Python Kata - Event Time Triggers - task description corrected to

[bsindhwani] Python Kata - Event Time Triggers - added streaming options

[bsindhwani] Python Kata - Event Time Triggers - add license text in task-info.yaml

[bsindhwani] Python Kata - Early Triggers

[bsindhwani] Python Kata - Event Time Triggers & Early Triggers - task.md - corrected

[bsindhwani] Python Kata - Window Accumulation Mode

[bsindhwani] Python Kata - Event Time Triggers - removed the external transform

[bsindhwani] Python Kata - added license to course-info.yaml

[Robert Burke] [BEAM-11779] Remove appliance restriction.

[ryanthompson591] Add retry_gcs_file_copy when 500 errors happen from server

[noreply] [BEAM-6374] Emit PCollection metrics from GoSDK (#15289)

[noreply] Merge pull request #15165 from [BEAM-12593] Verify DataFrame API on

[noreply] Merge pull request #15265 from [BEAM-12545] Extend FileSink by


------------------------------------------
[...truncated 347.46 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
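
The readUsingDefaultMethod failure above is the recurring coder issue in this job: the output of ParDo(RowMonitor) is a PCollection of Beam Rows, and a Row coder cannot be inferred; a schema has to be attached before the pipeline is finalized, exactly as the exception text suggests. A minimal, self-contained sketch of that remediation (the schema fields mirror the query in this test; the class and sample data are illustrative, not the IT's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        // Schema matching the four fields the test's query selects.
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        PCollection<Row> rows = p.apply(
            Create.of(Row.withSchema(schema)
                    .addValues("someone", "story", "a title", 3)
                    .build())
                .withRowSchema(schema));

        // A pass-through ParDo, standing in for ParDo(RowMonitor): its Row
        // output has no inferable coder until a schema is attached.
        PCollection<Row> monitored = rows.apply(ParDo.of(new DoFn<Row, Row>() {
          @ProcessElement
          public void processElement(@Element Row row, OutputReceiver<Row> out) {
            out.output(row);
          }
        }));

        // The fix named in the exception message.
        monitored.setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }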

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 12, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
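
The two plans above are the interesting part of this log: Calcite's LogicalProject/LogicalFilter pair is rewritten into a BeamPushDownIOSourceRel carrying both the used fields and the supported filter, so only four columns of pre-filtered rows leave BigQuery. The same query shape can be reproduced against an in-memory table with SqlTransform; nothing gets pushed down there, but it shows the SQL the planner is working from (a hedged sketch with made-up sample data, not the IT's BigQuery table setup):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class QueryShapeSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt32Field("score")
            .build();

        PCollection<Row> hackerNews = p.apply(
            Create.of(Row.withSchema(schema)
                    .addValues("alice", "story", "a title", 5)
                    .build())
                .withRowSchema(schema));

        // Same projection and filter as the pushed-down query in the log;
        // a PCollection queried directly is addressed as PCOLLECTION.
        PCollection<Row> result = hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
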
    Aug 12, 2021 6:44:59 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 12, 2021 6:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 12, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115907 bytes, hash 8c367f7a81359593836e1edfe9246e1c104c8bce5db9b2817ce5818bdfdad93d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jDZ_eoE1lZODbh7f6SRuHBBMi85dubKBfOWBi9_a2T0.pb
    Aug 12, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 12, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 12, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6103819292535727591.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-v21FplEtqtuxZtqmh3Pr_MpuT8pu6nMNBSfN4Lg-NmI.jar
    Aug 12, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 12, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 12, 2021 6:45:05 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
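
This SEVERE entry is gRPC's orphaned-channel detector, not a test failure: a BigQueryWriteClient is created during pipeline validation (getDatasetService leads to newBigQueryWriteClient) and its underlying channel is garbage-collected without ever being shut down. The cleanup discipline the message asks for looks like this (a minimal standalone sketch; the endpoint and timeout are illustrative):

    import java.util.concurrent.TimeUnit;

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel = ManagedChannelBuilder
            .forAddress("bigquerystorage.googleapis.com", 443)
            .useTransportSecurity()
            .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          // Exactly what the warning text prescribes: initiate shutdown,
          // wait for termination, and force-close if it does not finish.
          channel.shutdown();
          if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
            channel.shutdownNow();
          }
        }
      }
    }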

    Aug 12, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 12, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 12, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 12, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 12, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-12_11_45_05-833717377603007966?project=apache-beam-testing
    Aug 12, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-12_11_45_05-833717377603007966
    Aug 12, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-12_11_45_05-833717377603007966
    Aug 12, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-12T18:45:29.258Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 12, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:36.757Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 12, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:37.457Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 12, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:37.485Z: Expanding GroupByKey operations into optimizable parts.
    Aug 12, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:37.516Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 12, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:37.601Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 12, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:37.639Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 12, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:37.669Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 12, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:37.705Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 12, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:38.082Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2021 6:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:38.156Z: Starting 5 workers in us-central1-b...
    Aug 12, 2021 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:45:43.145Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 12, 2021 6:46:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:46:35.012Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 12, 2021 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:46:59.788Z: Workers have started successfully.
    Aug 12, 2021 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:46:59.817Z: Workers have started successfully.
    Aug 12, 2021 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:47:32.892Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2021 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:47:33.009Z: Cleaning up.
    Aug 12, 2021 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:47:33.091Z: Stopping worker pool...
    Aug 12, 2021 6:49:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:49:58.676Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 12, 2021 6:49:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T18:49:58.716Z: Worker pool stopped.
    Aug 12, 2021 6:50:04 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-12_11_45_05-833717377603007966 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 7592e01d-6659-4dd0-ba4c-cf92c90720be and timestamp: 2021-08-12T18:50:04.589000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.176

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2021 6:50:05 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 5 mins 15.916 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 45s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/cob5hgj5vjfti

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2292

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2292/display/redirect>

Changes:


------------------------------------------
[...truncated 347.37 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 12, 2021 12:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
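
The "BigQuery method is set to: DIRECT_READ" lines are what make the push-down above possible: the Storage Read API accepts a column projection and a row restriction at read time, whereas the default export-based read cannot. At the BigQueryIO level the equivalent read looks roughly like this (a sketch; the public table name is illustrative and not the table this IT reads):

    import java.util.Arrays;

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();

        PCollection<TableRow> rows = p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")
                .withMethod(TypedRead.Method.DIRECT_READ)
                // Projection and filter served by the Storage Read API,
                // mirroring what the SQL layer pushed down above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));

        p.run().waitUntilFinish();
      }
    }
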
    Aug 12, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 12, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 12, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115910 bytes, hash c2670bf7d3b997f9e663c461d2c239cff4faabb8bfc85f23ac0c89d9c5ad4326> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wmcL99O5l_nmY8Rh0sI5z_T6q7i_yF8jrAyJ2cWtQyY.pb
    Aug 12, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 12, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 12, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5159549922894911318.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XNP7phapJiew2Q3muzJ--Itrh9PrP0O7AuejlCIa_3w.jar
    Aug 12, 2021 12:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 12, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 12, 2021 12:45:04 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 12, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 12, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 12, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 12, 2021 12:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 12, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-12_05_45_04-9906366813276618176?project=apache-beam-testing
    Aug 12, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-12_05_45_04-9906366813276618176
    Aug 12, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-12_05_45_04-9906366813276618176
    Aug 12, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-12T12:45:18.807Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 12, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:23.158Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 12, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:23.887Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 12, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:23.929Z: Expanding GroupByKey operations into optimizable parts.
    Aug 12, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:23.957Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 12, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:24.012Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 12, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:24.032Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 12, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:24.058Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 12, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:24.096Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 12, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:24.462Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2021 12:45:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:24.531Z: Starting 5 workers in us-central1-a...
    Aug 12, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:45:40.373Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 12, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:46:13.859Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 12, 2021 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:46:39.881Z: Workers have started successfully.
    Aug 12, 2021 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:46:39.905Z: Workers have started successfully.
    Aug 12, 2021 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:47:08.587Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2021 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:47:08.727Z: Cleaning up.
    Aug 12, 2021 12:47:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:47:08.809Z: Stopping worker pool...
    Aug 12, 2021 12:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:49:22.343Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 12, 2021 12:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T12:49:22.373Z: Worker pool stopped.
    Aug 12, 2021 12:49:28 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-12_05_45_04-9906366813276618176 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 032298fb-9dea-4f0d-a1bb-65fac04af504 and timestamp: 2021-08-12T12:49:29.007000000Z:
                     Metric:                    Value:
                   read_time                      7.13
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2021 12:49:29 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
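
Editorially, the warning above means the InfluxDB sink was never configured for this run, so the read_time/fields_read numbers were printed but not stored. The sketch below shows how such properties are typically passed to Beam's perf tests through -DintegrationTestPipelineOptions; the option names and all values are assumptions inferred from the property names in the warning, not taken from this job's configuration:

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        -DintegrationTestPipelineOptions='["--influxDatabase=beam_test_metrics","--influxMeasurement=sql_bqio_read","--influxHost=http://localhost:8086"]'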

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 40.848 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
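
For reference, the failing task named above can be rerun with Gradle's suggested flag from a Beam checkout (the integration test will additionally need the usual GCP pipeline options, which are not shown in this log):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --stacktrace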

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 9s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/pepy4spdbwjc4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2291

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2291/display/redirect?page=changes>

Changes:

[noreply] [GoSDK Infra] Limit simultaneous tests binaries to 3. (#15321)


------------------------------------------
[...truncated 348.04 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2021 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2021 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2021 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2021 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2021 6:45:17 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2021 6:45:17 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 12, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
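
The two INFO lines above are the point of the push-down test: the projection (usedFields) and the filter both travel into the BigQuery Storage read instead of executing inside Beam. As a rough illustration only, here is a hand-written read with the same effect using BigQueryIO's public TypedRead API; the table reference is hypothetical, reconstructed from the query text, not taken from the IT:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(
            BigQueryIO.readTableRows()
                .from("apache-beam-testing:beam.HACKER_NEWS") // hypothetical table reference
                .withMethod(Method.DIRECT_READ)               // BigQuery Storage Read API
                // Only these columns are read, mirroring usedFields above.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Evaluated server-side, mirroring the pushed-down filter above.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }
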
    Aug 12, 2021 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 12, 2021 6:45:24 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 12, 2021 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115846 bytes, hash 617e9f9ebf01cb7571876126da6121afc202c70edf7cc09c8caa9270d6650cef> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-YX6fnr8By3Vxh2Em2mEhr8ICxw7ffMCcjKqScNZlDO8.pb
    Aug 12, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 12, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 12, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5581076700212787958.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-gGrRlT0j8rP_VPG0ss4Xq7npH0-Dq0s1CcKe_L_u2LU.jar
    Aug 12, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 12, 2021 6:45:27 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 12, 2021 6:45:27 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
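
The SEVERE message above reports an orphaned gRPC channel opened during pipeline validation (BigQueryIO's dataset service); it does not fail the test, but the remedy it asks for is the standard channel lifecycle. A minimal sketch of that discipline, using only the io.grpc.ManagedChannel API:

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    final class Channels {
      /** Orderly shutdown first; force-close only if RPCs are still in flight. */
      static void close(ManagedChannel channel) {
        channel.shutdown();
        try {
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        } catch (InterruptedException e) {
          channel.shutdownNow();
          Thread.currentThread().interrupt(); // preserve the interrupt flag
        }
      }
    }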

    Aug 12, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 12, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 12, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 12, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 12, 2021 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-11_23_45_28-15064017972578989690?project=apache-beam-testing
    Aug 12, 2021 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-11_23_45_28-15064017972578989690
    Aug 12, 2021 6:45:30 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-11_23_45_28-15064017972578989690
    Aug 12, 2021 6:45:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-12T06:45:33.005Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
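
The warning above is expected option interplay rather than an error: with autoscaling disabled, the Dataflow runner sizes the pool from numWorkers and ignores the requested maximum. A sketch of the two configurations, assuming the setter names on the public DataflowPipelineOptions interface:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    final class WorkerPoolConfig {
      static DataflowPipelineOptions fixedPool() {
        DataflowPipelineOptions opts = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
        // Fixed pool: exactly 5 workers; a maxNumWorkers setting would be ignored.
        opts.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        opts.setNumWorkers(5);
        return opts;
      }

      static DataflowPipelineOptions autoscaledPool() {
        DataflowPipelineOptions opts = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
        // Autoscaled pool: the service may scale up to 5 workers.
        opts.setAutoscalingAlgorithm(AutoscalingAlgorithmType.THROUGHPUT_BASED);
        opts.setMaxNumWorkers(5);
        return opts;
      }
    }
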
    Aug 12, 2021 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:45:39.107Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 12, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:45:39.917Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 12, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:45:39.941Z: Expanding GroupByKey operations into optimizable parts.
    Aug 12, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:45:39.970Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 12, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:45:40.049Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 12, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:45:40.069Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 12, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:45:40.098Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 12, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:45:40.127Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 12, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:45:40.492Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2021 6:45:42 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:45:40.562Z: Starting 5 workers in us-central1-a...
    Aug 12, 2021 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:46:07.551Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 12, 2021 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:46:10.432Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 12, 2021 6:46:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:46:10.467Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 12, 2021 6:46:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:46:20.731Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 12, 2021 6:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:46:45.118Z: Workers have started successfully.
    Aug 12, 2021 6:46:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:46:45.156Z: Workers have started successfully.
    Aug 12, 2021 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:47:13.774Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2021 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:47:13.911Z: Cleaning up.
    Aug 12, 2021 6:47:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:47:13.980Z: Stopping worker pool...
    Aug 12, 2021 6:49:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:49:31.663Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 12, 2021 6:49:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T06:49:31.714Z: Worker pool stopped.
    Aug 12, 2021 6:49:37 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-11_23_45_28-15064017972578989690 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8463701f-3c27-41aa-88a3-736a769005d4 and timestamp: 2021-08-12T06:49:37.298000000Z:
                     Metric:                    Value:
                   read_time                      6.96
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2021 6:49:37 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.039 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 31.869 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 15s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/45y5fjvyilkfc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2290

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2290/display/redirect?page=changes>

Changes:

[marwant] Move misplaced cloud healthcare update info line from template to

[Robert Burke] [BEAM-12738][GoSDK] Fix Dataflow Job URL

[eugene.nikolayev] Fix docs typos in contrib guide.

[noreply] Merge pull request #15126 from [BEAM-12665] Add option in ReadAll

[noreply] Update ptransform.py error message (#15286)


------------------------------------------
[...truncated 348.48 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
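
The root-cause list above names the fix directly: a Row-typed PCollection needs a schema attached before the pipeline can finish specifying. A minimal sketch of that suggestion; the schema is illustrative, with field names taken from the query in this log and types assumed:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    final class RowSchemaFix {
      /** Attach a schema (and thus a RowCoder) to a Row-typed PCollection. */
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score") // assumed type; match the real table
                .build();
        return rows.setRowSchema(schema);
      }
    }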

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 12, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 12, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 12, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 12, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 12, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 12, 2021 12:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 12, 2021 12:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 12, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 7b90c7de693c488ab89f1321465fe80ed14dc576f90ab376474017c8711f40ae> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-e5DH3mk8SIq4nxMhRl_oDtFNxXb5CrN2R0AXyHEfQK4.pb
    Aug 12, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 12, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 12, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test703950539460869924.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-wL64GOMMuCymSnHCr6ziFeS3mEKprxA7jILn2z_C260.jar
    Aug 12, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 12, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 12, 2021 12:45:04 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 12, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 12, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 12, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 12, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 12, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-11_17_45_04-2082918482093420055?project=apache-beam-testing
    Aug 12, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-11_17_45_04-2082918482093420055
    Aug 12, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-11_17_45_04-2082918482093420055
    Aug 12, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-12T00:45:07.996Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 12, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:13.844Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 12, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:14.692Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 12, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:14.746Z: Expanding GroupByKey operations into optimizable parts.
    Aug 12, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:14.773Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 12, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:14.866Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 12, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:14.897Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 12, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:14.929Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 12, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:14.973Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 12, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:15.367Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:15.450Z: Starting 5 workers in us-central1-c...
    Aug 12, 2021 12:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:39.147Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 12, 2021 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:45:54.792Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 12, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:46:21.533Z: Workers have started successfully.
    Aug 12, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:46:21.574Z: Workers have started successfully.
    Aug 12, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:46:52.043Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 12, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:46:52.222Z: Cleaning up.
    Aug 12, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:46:52.311Z: Stopping worker pool...
    Aug 12, 2021 12:49:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:49:07.347Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 12, 2021 12:49:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-12T00:49:07.414Z: Worker pool stopped.
    Aug 12, 2021 12:49:18 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-11_17_45_04-2082918482093420055 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): de1d6096-bd03-455c-9711-6c3ba390052d and timestamp: 2021-08-12T00:49:18.164000000Z:
                     Metric:                    Value:
                   read_time                      9.91
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 12, 2021 12:49:18 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 30.58 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 0s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/53ekbtx6zidze

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2289

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2289/display/redirect>

Changes:


------------------------------------------
[...truncated 347.38 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 11, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 11, 2021 6:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 11, 2021 6:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 11, 2021 6:45:02 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 703d4d12d970fcfce7d1635fd4e63109f3c202393cf4b4ffff768b4f73f2b536> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cD1NEtlw_Pzn0WNf1OYxCfPCAjk89LT__3aLT3PytTY.pb
    Aug 11, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 11, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 11, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7663391599320107006.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-GuHl7pIsnJ6DatXH3fI9lsLK-yyuyssNZWciRgCtrw4.jar
    Aug 11, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 11, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 11, 2021 6:45:05 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
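
The SEVERE entry above is gRPC's orphaned-channel check: a ManagedChannel was garbage-collected while still open, and the RuntimeException is only the recorded allocation site, not a crash. A minimal sketch of the shutdown sequence the message asks for, assuming an illustrative target and timeout (not taken from this build):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        // Illustrative target; TLS is the grpc-java default for forTarget().
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
        try {
          // ... issue RPCs here ...
        } finally {
          channel.shutdown();                      // stop accepting new calls
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();                 // cancel anything still in flight
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }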

    Aug 11, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 11, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 11, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 11, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 11, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-11_11_45_06-14989026027848510153?project=apache-beam-testing
    Aug 11, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-11_11_45_06-14989026027848510153
    Aug 11, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-11_11_45_06-14989026027848510153
    Aug 11, 2021 6:45:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-11T18:45:09.905Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
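
This warning is about how two pipeline options interact: the test pins a fixed worker pool, so the max-workers setting has no effect. A hedged sketch of the relevant Dataflow worker-pool options (values illustrative, not this job's actual flags):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // A fixed-size pool: with autoscaling disabled, maxNumWorkers is ignored,
        // which is exactly what the warning above reports.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);
        // options.setMaxNumWorkers(5);  // only meaningful with THROUGHPUT_BASED
      }
    }
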
    Aug 11, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:16.372Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 11, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:17.052Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 11, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:17.103Z: Expanding GroupByKey operations into optimizable parts.
    Aug 11, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:17.127Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 11, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:17.212Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 11, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:17.243Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 11, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:17.279Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 11, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:17.314Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 11, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:17.819Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2021 6:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:17.920Z: Starting 5 workers in us-central1-b...
    Aug 11, 2021 6:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:45:48.054Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 11, 2021 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:46:00.210Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 11, 2021 6:46:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:46:00.244Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 11, 2021 6:46:12 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:46:10.676Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 11, 2021 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:46:35.310Z: Workers have started successfully.
    Aug 11, 2021 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:46:35.345Z: Workers have started successfully.
    Aug 11, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:47:06.363Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:47:06.558Z: Cleaning up.
    Aug 11, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:47:06.663Z: Stopping worker pool...
    Aug 11, 2021 6:49:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:49:33.032Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 11, 2021 6:49:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T18:49:33.075Z: Worker pool stopped.
    Aug 11, 2021 6:49:38 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-11_11_45_06-14989026027848510153 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ccd4a958-7e92-48cd-a1c3-7316671db5be and timestamp: 2021-08-11T18:49:38.930000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.632

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2021 6:49:39 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
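
For context on this warning: the publisher needs a target database and a measurement name because both are mandatory in InfluxDB's 1.x write API, which takes the database as a query parameter and the measurement as the first token of each line-protocol record. A minimal sketch of that write path, assuming an invented host, database, and measurement name (the field values are just the metrics printed above):

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class InfluxLineProtocolSketch {
      public static void main(String[] args) throws Exception {
        // InfluxDB 1.x: database in the query string, measurement in the body.
        URL url = new URL("http://localhost:8086/write?db=beam_test_metrics");
        String record =
            "sql_bqio_read,test_id=ccd4a958 read_time=7.632,fields_read=4375276";
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
          out.write(record.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP " + conn.getResponseCode());  // 204 on success
      }
    }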

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 49.943 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 20s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/pfmkxb7xn6wdg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2288

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2288/display/redirect>

Changes:


------------------------------------------
[...truncated 358.21 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
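
The truncated failure above names its own remedy: a PCollection of Beam Rows must be given a schema explicitly, since Row coders cannot be inferred. A self-contained sketch of that pattern with an invented schema and pipeline (this is not the IT's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.create());
        Schema schema =
            Schema.builder().addStringField("author").addInt64Field("score").build();
        PCollection<Row> rows =
            p.apply(Create.of(Row.withSchema(schema).addValues("alice", 3L).build())
                .withRowSchema(schema));
        PCollection<Row> monitored =
            rows.apply(ParDo.of(new DoFn<Row, Row>() {
                  @ProcessElement
                  public void process(@Element Row row, OutputReceiver<Row> out) {
                    out.output(row);
                  }
                }))
                // Without this, coder inference fails exactly as logged above.
                .setRowSchema(schema);
        p.run().waitUntilFinish();
      }
    }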

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2021 12:46:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2021 12:46:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2021 12:46:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2021 12:46:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2021 12:46:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2021 12:46:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2021 12:46:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 11, 2021 12:46:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
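
What the planner logs above amount to at the IO layer is a BigQuery Storage API read with a column projection and a row restriction. A hand-written equivalent using BigQueryIO, assuming an invented table reference (a sketch of the underlying read, not the code this test runs):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DirectReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:my_dataset.HACKER_NEWS")  // invented table ref
                .withMethod(Method.DIRECT_READ)             // Storage Read API
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }
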
    Aug 11, 2021 12:46:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 11, 2021 12:46:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 11, 2021 12:46:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115909 bytes, hash 5f023c2820d50e72c6d3576fbf2be0eb2e862067c5e06c89b1adbf40e409938d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-XwI8KCDVDnLG01dvvyvg6y6GIGfF4GyJsa2_QOQJk40.pb
    Aug 11, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 11, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5587599285382013642.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YV3mFkTuCH7Fz-JRgL1u1ayOqMyZyn3yPcA_zKhwZCg.jar
    Aug 11, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 11, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.mongodb/mongo-java-driver/3.12.7/1f45c6a397feeb46da75425619333d1cc6f90f78/mongo-java-driver-3.12.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/mongo-java-driver-3.12.7-D_zgBJWNb9mzmuetJ37a0X9XtpcfSGsXYpxe6eE8Tao.jar
    Aug 11, 2021 12:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 2 files newly uploaded in 0 seconds
    Aug 11, 2021 12:46:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 11, 2021 12:46:25 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 11, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 11, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 11, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 11, 2021 12:46:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 11, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-11_05_46_26-6458339860388639475?project=apache-beam-testing
    Aug 11, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-11_05_46_26-6458339860388639475
    Aug 11, 2021 12:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-11_05_46_26-6458339860388639475
    Aug 11, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-11T12:46:29.742Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 11, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:37.139Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 11, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:37.896Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 11, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:37.936Z: Expanding GroupByKey operations into optimizable parts.
    Aug 11, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:37.971Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 11, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:38.060Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 11, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:38.101Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 11, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:38.157Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 11, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:38.184Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 11, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:38.569Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2021 12:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:38.661Z: Starting 5 workers in us-central1-c...
    Aug 11, 2021 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:46:44.356Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 11, 2021 12:47:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:47:19.130Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 11, 2021 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:47:46.945Z: Workers have started successfully.
    Aug 11, 2021 12:47:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:47:46.976Z: Workers have started successfully.
    Aug 11, 2021 12:48:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:48:16.827Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2021 12:48:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:48:17.014Z: Cleaning up.
    Aug 11, 2021 12:48:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:48:17.112Z: Stopping worker pool...
    Aug 11, 2021 12:50:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:50:44.021Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 11, 2021 12:50:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T12:50:44.079Z: Worker pool stopped.
    Aug 11, 2021 12:50:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-11_05_46_26-6458339860388639475 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): fb0d3fd3-2b15-490e-bf8f-a8c01bf0b1f5 and timestamp: 2021-08-11T12:50:53.271000000Z:
                     Metric:                    Value:
                   read_time                     9.063
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2021 12:50:53 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 43.979 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 33s
152 actionable tasks: 104 executed, 48 from cache

Publishing build scan...
https://gradle.com/s/75nyueaw3os7o

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2287

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2287/display/redirect>

Changes:


------------------------------------------
[...truncated 348.00 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2021 6:45:04 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 11, 2021 6:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 11, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 11, 2021 6:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 11, 2021 6:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash e4cf346129b824f25f076805b8d34a8ea6b507dcbb3059e36fbf1b8913dcce3d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-5M80YSm4JPJfB2gFuNNKjqa1B9y7MFnjb78biRPczj0.pb
    Aug 11, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 11, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 11, 2021 6:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8722539986529901259.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Ueq0HJ1XGDwwDFSSd7grp3YeYHFKbThn-20yDUBsB0Y.jar
    Aug 11, 2021 6:45:12 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 11, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 11, 2021 6:45:12 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 11, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 11, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 11, 2021 6:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 11, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 11, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-10_23_45_13-12297051595530735808?project=apache-beam-testing
    Aug 11, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-10_23_45_13-12297051595530735808
    Aug 11, 2021 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-10_23_45_13-12297051595530735808
    Aug 11, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-11T06:45:16.632Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 11, 2021 6:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:23.512Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 11, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:24.242Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 11, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:24.285Z: Expanding GroupByKey operations into optimizable parts.
    Aug 11, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:24.323Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 11, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:24.409Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 11, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:24.447Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 11, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:24.484Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 11, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:24.516Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 11, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:24.933Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2021 6:45:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:25.019Z: Starting 5 workers in us-central1-c...
    Aug 11, 2021 6:45:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:50.880Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 11, 2021 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:55.302Z: Autoscaling: Raised the number of workers to 2 based on the rate of progress in the currently running stage(s).
    Aug 11, 2021 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:45:55.330Z: Resized worker pool to 2, though goal was 5.  This could be a quota issue.
    Aug 11, 2021 6:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:46:05.545Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 11, 2021 6:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:46:30.519Z: Workers have started successfully.
    Aug 11, 2021 6:46:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:46:30.554Z: Workers have started successfully.
    Aug 11, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:47:00.559Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:47:00.767Z: Cleaning up.
    Aug 11, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:47:00.889Z: Stopping worker pool...
    Aug 11, 2021 6:49:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:49:19.058Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 11, 2021 6:49:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T06:49:19.109Z: Worker pool stopped.
    Aug 11, 2021 6:49:25 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-10_23_45_13-12297051595530735808 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): bca61b79-18bd-4702-83a7-3e3c28074bf4 and timestamp: 2021-08-11T06:49:25.986000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.292

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2021 6:49:26 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
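
The InfluxDBPublisher warning is a configuration gap, not a test failure: the suite collected fields_read/read_time above but had nowhere to write them. A hedged example of supplying the missing settings on the Gradle invocation (the option names --influxDatabase/--influxMeasurement and both values are assumptions based on Beam's perf-test conventions; verify against org.apache.beam.sdk.testutils.publishing.InfluxDBSettings before relying on them):
> ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
    '-DintegrationTestPipelineOptions=["--influxDatabase=beam_test_metrics","--influxMeasurement=sql_bqio_read_java_batch"]'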

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.036 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 32.228 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
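
To surface the individual deprecations before a Gradle 7 upgrade, the task can be rerun with the flag the message names, e.g. (illustrative invocation):
> ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest --warning-mode all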

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/y2wxkh4q5f47o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2286

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2286/display/redirect?page=changes>

Changes:

[andyxu] Add support for python sdk container to turn on google cloud profiler

[andyxu] format file

[marwant] Update Google Cloud Healthcare API version from v1beta1 to GA

[zyichi] Update beam dataflow python container versions.


------------------------------------------
[...truncated 369.31 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 11, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 11, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 11, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 11, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 11, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
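
The BEAMPlan above shows the projection and filter migrating out of BeamCalcRel and into the IO source. At the IO level this push-down amounts to asking the BigQuery Storage Read API for only the used columns plus a server-side row restriction; a minimal hedged sketch of that shape using the public BigQueryIO API (the table name and pipeline wiring are illustrative, not the test's actual code):

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public final class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(
            BigQueryIO.readTableRows()
                .from("bigquery-public-data:hacker_news.full")   // assumed table
                .withMethod(Method.DIRECT_READ)                  // Storage Read API
                // only the columns the query uses ...
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // ... and the filter, evaluated server-side
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        p.run().waitUntilFinish();
      }
    }
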
    Aug 11, 2021 12:47:34 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 11, 2021 12:47:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 11, 2021 12:47:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115847 bytes, hash c2d423aca2398ac4cf19a3ff14087a1fcf07b6a75a8e9f5d674062d35a0f7944> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wtQjrKI5isTPGaP_FAh6H88Htqdajp9dZ0Bi01oPeUQ.pb
    Aug 11, 2021 12:47:40 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 11, 2021 12:47:40 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 11, 2021 12:47:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8340152618941482265.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-0SMJq_NEMXlSY-bcL5cmZZvVtvoMvFYOqOkrhAeg6jM.jar
    Aug 11, 2021 12:47:41 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 1 seconds
    Aug 11, 2021 12:47:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 11, 2021 12:47:41 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
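
This SEVERE entry is gRPC's orphaned-channel detection, not a test assertion: a ManagedChannel opened for the bigquerystorage.googleapis.com endpoint was garbage-collected before being shut down. The allocation trace shows the channel is owned by a client that Beam builds internally during pipeline validation, so the warning points at SDK cleanup rather than test code; still, the pattern the message asks for looks roughly like this hedged sketch (endpoint and timeout are illustrative):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    final class ChannelShutdownSketch {
      static void shutdownGracefully(ManagedChannel channel) {
        channel.shutdown();                       // begin orderly shutdown
        try {
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();                // force-close stragglers
          }
        } catch (InterruptedException e) {
          channel.shutdownNow();
          Thread.currentThread().interrupt();
        }
      }

      public static void main(String[] args) {
        ManagedChannel channel =
            ManagedChannelBuilder.forAddress("bigquerystorage.googleapis.com", 443)
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs through stubs bound to this channel ...
        } finally {
          shutdownGracefully(channel);            // the sequence the log asks for
        }
      }
    }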

    Aug 11, 2021 12:47:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 11, 2021 12:47:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 11, 2021 12:47:41 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 11, 2021 12:47:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 11, 2021 12:47:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-10_17_47_42-2296581678762886941?project=apache-beam-testing
    Aug 11, 2021 12:47:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-10_17_47_42-2296581678762886941
    Aug 11, 2021 12:47:42 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-10_17_47_42-2296581678762886941
    Aug 11, 2021 12:47:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-11T00:47:46.212Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 11, 2021 12:47:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:47:51.126Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 11, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:47:51.786Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 11, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:47:51.827Z: Expanding GroupByKey operations into optimizable parts.
    Aug 11, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:47:51.853Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 11, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:47:51.923Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 11, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:47:51.960Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 11, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:47:51.988Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 11, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:47:52.023Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 11, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:47:52.359Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2021 12:47:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:47:52.441Z: Starting 5 workers in us-central1-c...
    Aug 11, 2021 12:48:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:48:03.498Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 11, 2021 12:48:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:48:21.783Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 11, 2021 12:48:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:48:21.821Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 11, 2021 12:48:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:48:32.035Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 11, 2021 12:48:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:48:58.165Z: Workers have started successfully.
    Aug 11, 2021 12:48:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:48:58.195Z: Workers have started successfully.
    Aug 11, 2021 12:49:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:49:29.783Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 11, 2021 12:49:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:49:29.967Z: Cleaning up.
    Aug 11, 2021 12:49:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:49:30.058Z: Stopping worker pool...
    Aug 11, 2021 12:51:47 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:51:47.804Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 11, 2021 12:51:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-11T00:51:47.861Z: Worker pool stopped.
    Aug 11, 2021 12:51:55 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-10_17_47_42-2296581678762886941 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6084cc5d-fa66-4a54-831b-a94f3ecb40b2 and timestamp: 2021-08-11T00:51:55.876000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.943

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 11, 2021 12:51:56 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 6 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.103 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 31.954 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 36s
152 actionable tasks: 112 executed, 40 from cache

Publishing build scan...
https://gradle.com/s/5uj3apzpct3mm

Stopped 5 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2285

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2285/display/redirect?page=changes>

Changes:

[chamikaramj] Fix typo

[noreply] [BEAM-12403][BEAM-12348] Add User State Support for Portable SamzaRunner


------------------------------------------
[...truncated 347.76 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@686319773]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
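
The exception message lists the fix directly: a PCollection of Beam Rows cannot get a coder from the CoderRegistry, so the pipeline must attach either an explicit coder via setCoder() or, more idiomatically, a schema via setRowSchema() so a RowCoder can be inferred. A minimal hedged sketch (field names follow the projected columns of the SQL above; the types, nullability, and wiring are assumptions, not the test's actual code):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    final class RowSchemaSketch {
      /** Attaches a schema so downstream transforms can infer a RowCoder. */
      static PCollection<Row> withOutputSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        return rows.setRowSchema(schema);  // equivalent to setCoder(RowCoder.of(schema))
      }
    }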

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2021 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2021 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2021 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2021 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2021 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2021 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2021 6:45:20 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 10, 2021 6:45:20 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 10, 2021 6:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 10, 2021 6:45:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 10, 2021 6:45:27 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115861 bytes, hash 001042a4ad2bf593ac267226f2da133118c4f5dd9b6494ce5ca640a4f0588e0d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ABBCpK0r9ZOsJnIm8toTMRjE9d2bZJTOXKZApPBYjg0.pb
    Aug 10, 2021 6:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 10, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 10, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5027400550484885017.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-OZursV7ZAtxmxHgNQ5Y-26VqGMhXtIz5kooD_aHrHoU.jar
    Aug 10, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 10, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 10, 2021 6:45:30 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 10, 2021 6:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 10, 2021 6:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 10, 2021 6:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 10, 2021 6:45:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 10, 2021 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-10_11_45_31-15842719607510983386?project=apache-beam-testing
    Aug 10, 2021 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-10_11_45_31-15842719607510983386
    Aug 10, 2021 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-10_11_45_31-15842719607510983386
    Aug 10, 2021 6:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-10T18:45:34.772Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 10, 2021 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:39.998Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 10, 2021 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:40.762Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 10, 2021 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:40.804Z: Expanding GroupByKey operations into optimizable parts.
    Aug 10, 2021 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:40.848Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 10, 2021 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:40.930Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 10, 2021 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:40.961Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 10, 2021 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:40.988Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 10, 2021 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:41.041Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 10, 2021 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:41.454Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2021 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:41.545Z: Starting 5 workers in us-central1-a...
    Aug 10, 2021 6:45:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:45:49.378Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 10, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:46:32.430Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 10, 2021 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:46:58.551Z: Workers have started successfully.
    Aug 10, 2021 6:47:00 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:46:58.582Z: Workers have started successfully.
    Aug 10, 2021 6:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:47:28.274Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2021 6:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:47:28.425Z: Cleaning up.
    Aug 10, 2021 6:47:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:47:28.534Z: Stopping worker pool...
    Aug 10, 2021 6:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:49:49.324Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 10, 2021 6:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T18:49:49.368Z: Worker pool stopped.
    Aug 10, 2021 6:49:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-10_11_45_31-15842719607510983386 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b70d1f5d-103c-4e4a-891a-0902d2058f11 and timestamp: 2021-08-10T18:49:58.671000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.813

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2021 6:49:59 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 55.471 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 37s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/u7i3w7bv4iltk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2284

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2284/display/redirect>

Changes:


------------------------------------------
[...truncated 347.13 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 10, 2021 12:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 10, 2021 12:44:56 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 10, 2021 12:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 10, 2021 12:44:58 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115923 bytes, hash a088781dc4287fa3650de4f994b8a5def19a8639e8d933fc0f5cc0a963305fea> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-oIh4HcQof6NlDeT5lLil3vGahjno2TP8D1zAqWMwX-o.pb
    Aug 10, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 10, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 10, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2575542688212577528.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Ji4afVt5b57bnFtZIpdUbiTswRYUHJG2oocBXRsn78g.jar
    Aug 10, 2021 12:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 10, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 10, 2021 12:45:01 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
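
The SEVERE message above spells out the expected cleanup sequence. The leaked channel here is opened inside the BigQuery write client during pipeline validation rather than in test code, but on any io.grpc.ManagedChannel the requested sequence looks like this (the target string is illustrative):

    import java.util.concurrent.TimeUnit;

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          channel.shutdown(); // begin graceful shutdown
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-close if graceful shutdown stalls
          }
        }
      }
    }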

    Aug 10, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 10, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 10, 2021 12:45:01 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 10, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 10, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-10_05_45_02-6356522589584892607?project=apache-beam-testing
    Aug 10, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-10_05_45_02-6356522589584892607
    Aug 10, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-10_05_45_02-6356522589584892607
    Aug 10, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-10T12:45:05.652Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 10, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:12.921Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 10, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:13.727Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 10, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:13.769Z: Expanding GroupByKey operations into optimizable parts.
    Aug 10, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:13.805Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 10, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:13.884Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 10, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:13.911Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 10, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:13.951Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 10, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:13.990Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 10, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:14.479Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:14.555Z: Starting 5 workers in us-central1-c...
    Aug 10, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:27.177Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 10, 2021 12:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:45:54.259Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 10, 2021 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:46:21.947Z: Workers have started successfully.
    Aug 10, 2021 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:46:21.976Z: Workers have started successfully.
    Aug 10, 2021 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:46:53.482Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2021 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:46:53.680Z: Cleaning up.
    Aug 10, 2021 12:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:46:53.795Z: Stopping worker pool...
    Aug 10, 2021 12:49:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:49:19.437Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 10, 2021 12:49:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T12:49:19.482Z: Worker pool stopped.
    Aug 10, 2021 12:49:25 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-10_05_45_02-6356522589584892607 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 8571898a-a7e4-484d-a730-580cdf0a824b and timestamp: 2021-08-10T12:49:25.338000000Z:
                     Metric:                    Value:
                   read_time                     9.929
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2021 12:49:25 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
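
The warning above means the metrics printed under STANDARD_OUT were computed but never published, because the publisher was given no database or measurement name. A sketch of supplying them through Beam's test-utility settings builder; the builder methods and all values below are assumptions for illustration, not verified configuration for this job:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      // All three values are placeholders; real jobs take them from options.
      static InfluxDBSettings settings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();
      }
    }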

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 39.194 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/pit66j4h2hkdy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2283

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2283/display/redirect>

Changes:


------------------------------------------
[...truncated 353.93 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2021 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2021 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2021 6:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2021 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2021 6:45:20 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2021 6:45:20 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 10, 2021 6:45:21 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 10, 2021 6:45:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 10, 2021 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 10, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash 275390a572b35dd5bb4a6f9e69d08dbb180f33b575c9a12b690fc954d751f2ab> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-J1OQpXKzXdW7Sm-eadCNuxgPM7V1yaEraQ_JVNdR8qs.pb
    Aug 10, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 10, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 10, 2021 6:45:27 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7287112593979012275.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-FwJ8P6nwhq9eBN7pf3xCA4451kProN5o_3D5ndlIR20.jar
    Aug 10, 2021 6:45:28 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 10, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 10, 2021 6:45:28 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 10, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 10, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 10, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 10, 2021 6:45:28 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 10, 2021 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-09_23_45_28-12842127209114945689?project=apache-beam-testing
    Aug 10, 2021 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-09_23_45_28-12842127209114945689
    Aug 10, 2021 6:45:29 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-09_23_45_28-12842127209114945689
    Aug 10, 2021 6:45:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-10T06:45:32.222Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 10, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:45:38.912Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 10, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:45:39.666Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 10, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:45:39.768Z: Expanding GroupByKey operations into optimizable parts.
    Aug 10, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:45:39.795Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 10, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:45:39.910Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 10, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:45:39.940Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 10, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:45:39.965Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 10, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:45:39.990Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 10, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:45:40.309Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2021 6:45:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:45:40.380Z: Starting 5 workers in us-central1-c...
    Aug 10, 2021 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:46:10.297Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 10, 2021 6:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:46:20.263Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 10, 2021 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:46:46.747Z: Workers have started successfully.
    Aug 10, 2021 6:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:46:46.781Z: Workers have started successfully.
    Aug 10, 2021 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:47:15.979Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2021 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:47:16.111Z: Cleaning up.
    Aug 10, 2021 6:47:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:47:16.191Z: Stopping worker pool...
    Aug 10, 2021 6:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:49:42.705Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 10, 2021 6:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T06:49:42.756Z: Worker pool stopped.
    Aug 10, 2021 6:49:48 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-09_23_45_28-12842127209114945689 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 4b38e277-2650-4ac0-808e-5bc32f5f191a and timestamp: 2021-08-10T06:49:48.215000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.584

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2021 6:49:48 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 37.077 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 28s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/xhbgal7ynrfwo

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2282

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2282/display/redirect?page=changes>

Changes:

[heejong] [BEAM-12716] Update XLang Kafka taxi example to use BigQuery sink

[noreply] Merge pull request #15298 from [BEAM-12729] Suppress Avro Runtime

[heejong] add comment

[noreply] Refactor _RemoveDuplicates to use counter state (#15242)


------------------------------------------
[...truncated 365.60 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2021 12:48:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2021 12:48:11 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2021 12:48:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 10, 2021 12:48:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 10, 2021 12:48:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 10, 2021 12:48:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 10, 2021 12:48:12 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 10, 2021 12:48:12 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 10, 2021 12:48:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 10, 2021 12:48:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 10, 2021 12:48:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 4d4a6bdc7f24cc851e64605e06e8ffb225493b32b854267af832558dff32fc19> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-TUpr3H8kzIUeZGBeBuj_siVJOzK4VCZ6-DJVjf8y_Bk.pb
    Aug 10, 2021 12:48:19 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 10, 2021 12:48:19 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 10, 2021 12:48:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2039390677506403663.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-PD7mdh2x24CFXClt5RyjEdjTegtGKL3223-KxlO6ak0.jar
    Aug 10, 2021 12:48:19 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-g0ohw9VlTkHB2keebpWvYh3yUlHSkBv8XX7mS7Pswbw.jar
    Aug 10, 2021 12:48:20 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 2 files newly uploaded in 1 seconds
    Aug 10, 2021 12:48:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 10, 2021 12:48:20 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 10, 2021 12:48:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 10, 2021 12:48:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 10, 2021 12:48:20 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 10, 2021 12:48:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 10, 2021 12:48:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-09_17_48_21-10909677270399682795?project=apache-beam-testing
    Aug 10, 2021 12:48:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-09_17_48_21-10909677270399682795
    Aug 10, 2021 12:48:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-09_17_48_21-10909677270399682795
    Aug 10, 2021 12:48:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-10T00:48:24.518Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
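
This warning is expected for this job: it runs with autoscaling disabled and a fixed pool of five workers, so the max-workers setting has no effect. In Dataflow pipeline-option terms (option names as in DataflowPipelineWorkerPoolOptions) that configuration corresponds roughly to:

    --numWorkers=5 --autoscalingAlgorithm=NONE
    # --maxNumWorkers is ignored when autoscaling is NONE
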
    Aug 10, 2021 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:48:30.258Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 10, 2021 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:48:31.011Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 10, 2021 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:48:31.046Z: Expanding GroupByKey operations into optimizable parts.
    Aug 10, 2021 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:48:31.092Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 10, 2021 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:48:31.183Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 10, 2021 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:48:31.220Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 10, 2021 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:48:31.248Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 10, 2021 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:48:31.289Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 10, 2021 12:48:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:48:31.661Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2021 12:48:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:48:31.752Z: Starting 5 workers in us-central1-a...
    Aug 10, 2021 12:49:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:49:02.155Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 10, 2021 12:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:49:15.596Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 10, 2021 12:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:49:41.037Z: Workers have started successfully.
    Aug 10, 2021 12:49:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:49:41.071Z: Workers have started successfully.
    Aug 10, 2021 12:50:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:50:11.159Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 10, 2021 12:50:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:50:11.318Z: Cleaning up.
    Aug 10, 2021 12:50:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:50:11.392Z: Stopping worker pool...
    Aug 10, 2021 12:52:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:52:31.291Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 10, 2021 12:52:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-10T00:52:31.331Z: Worker pool stopped.
    Aug 10, 2021 12:52:37 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-09_17_48_21-10909677270399682795 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a79b2016-676f-4a15-bd85-fd389b194e7e and timestamp: 2021-08-10T00:52:37.993000000Z:
                     Metric:                    Value:
                   read_time                     8.913
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 10, 2021 12:52:38 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
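
This warning is unrelated to the test failures: the run simply skipped exporting its metrics because no InfluxDB target was configured. If publishing were wanted, the run would need the corresponding options; the names below follow Beam's test-utils conventions and are an assumption about this job's configuration, not taken from this log:

    --influxDatabase=<database> --influxMeasurement=<measurement> --influxHost=<host-url>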

Gradle Test Executor 6 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 36.093 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
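
For local reproduction it is usually enough to rerun just the failing suite rather than the whole task; a sketch of that invocation (it still needs the pipeline options the CI job supplies via -DintegrationTestPipelineOptions, e.g. GCP project and temp bucket, which are not shown in this log):

    ./gradlew :sdks:java:extensions:sql:perf-tests:integrationTest \
        --tests '*BigQueryIOPushDownIT' --stacktrace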

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 16s
152 actionable tasks: 110 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/hchzlpwjio4wo

Stopped 5 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2281

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2281/display/redirect>

Changes:


------------------------------------------
[...truncated 346.32 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
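
The exception message itself names the fix: a PCollection of Beam Row elements has no inferable Coder, so whatever transform produces it must attach a schema via PCollection.setRowSchema. A minimal self-contained sketch of the pattern (the schema, values, and class name are illustrative, not the test's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class SetRowSchemaExample {
      public static void main(String[] args) {
        // Hypothetical schema matching the columns projected by the query above.
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();

        Pipeline p = Pipeline.create();
        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(
                    MapElements.into(TypeDescriptors.rows())
                        .via(t -> Row.withSchema(schema)
                            .addValues("someone", t, "a title", 3L)
                            .build()))
                // Without this call, pipeline construction fails with the same
                // "Unable to return a default Coder ... Please provide a schema
                // instead using PCollection.setRowSchema" error seen above.
                .setRowSchema(schema);
        p.run().waitUntilFinish(); // runnable with the direct runner on the classpath
      }
    }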

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 09, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
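
These two INFO lines show what the planner's push-down buys: only the four referenced columns are requested from the Storage Read API, and the supported part of the WHERE clause travels with the read as a row restriction. Hand-written, the equivalent read looks roughly like the sketch below (the class name and table reference are placeholders; the test's real table is the registered `beam`.`HACKER_NEWS`):

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;
    import org.apache.beam.sdk.values.PCollection;

    public class PushDownReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<TableRow> rows =
            p.apply(
                "DirectReadWithPushDown",
                BigQueryIO.readTableRows()
                    .from("apache-beam-testing:beam.HACKER_NEWS") // placeholder reference
                    .withMethod(Method.DIRECT_READ)
                    // Column projection, as in usedFields=[[by, type, title, score]]
                    .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                    // Predicate push-down, as in the filter logged above
                    .withRowRestriction(
                        "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));
        // p.run() omitted: executing this needs GCP credentials and a real table.
      }
    }
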
    Aug 09, 2021 6:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 09, 2021 6:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 09, 2021 6:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115926 bytes, hash ec162277add490ca519158a41fdb3f3b8a28c85a6de8ead4a4414f17f629436e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7BYid63UkMpRkVikH9s_O4ooyFpt6OrUpEFPF_YpQ24.pb
    Aug 09, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 09, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 09, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8665429809620878486.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-1usM4XCEdDnqfmQuzXui9LEO1G8dg0J4517yV3tujm8.jar
    Aug 09, 2021 6:45:04 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 09, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 09, 2021 6:45:04 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
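
This SEVERE entry is gRPC's orphaned-channel detector firing while the test constructs a BigQueryWriteClient during pipeline validation (see the BigQueryIO$TypedRead.validate frames above); it is logged as an error, but the Dataflow job below still completes with status DONE. The idiom the message asks for, shown generically (this is the standard gRPC pattern, not the Beam-internal code that actually owns this channel):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownExample {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs through stubs bound to this channel ...
        } finally {
          channel.shutdown(); // begin orderly shutdown
          if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-close anything still pending
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }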

    Aug 09, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 09, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 09, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 09, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 09, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-09_11_45_04-6823921311771424476?project=apache-beam-testing
    Aug 09, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-09_11_45_04-6823921311771424476
    Aug 09, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-09_11_45_04-6823921311771424476
    Aug 09, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-09T18:45:08.179Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 09, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:15.133Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 09, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:15.911Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 09, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:15.960Z: Expanding GroupByKey operations into optimizable parts.
    Aug 09, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:16.006Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 09, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:16.075Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 09, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:16.119Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 09, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:16.153Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 09, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:16.182Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 09, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:16.558Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:16.639Z: Starting 5 workers in us-central1-b...
    Aug 09, 2021 6:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:45:27.280Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 09, 2021 6:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:46:01.583Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 09, 2021 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:46:27.462Z: Workers have started successfully.
    Aug 09, 2021 6:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:46:27.482Z: Workers have started successfully.
    Aug 09, 2021 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:47:00.290Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2021 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:47:00.434Z: Cleaning up.
    Aug 09, 2021 6:47:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:47:00.505Z: Stopping worker pool...
    Aug 09, 2021 6:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:49:23.992Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 09, 2021 6:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T18:49:24.025Z: Worker pool stopped.
    Aug 09, 2021 6:49:30 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-09_11_45_04-6823921311771424476 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): a36ea225-449e-4903-b1cc-b28afcbc2959 and timestamp: 2021-08-09T18:49:30.993000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.662

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2021 6:49:31 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 4 mins 43.319 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 13s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/n4rmnoypwwrek

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2280

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2280/display/redirect>

Changes:


------------------------------------------
[...truncated 346.41 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 09, 2021 12:45:07 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 09, 2021 12:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 09, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 09, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 21bf2d68c690dae6434c9f97e8c0fa73b188ac65eaa5ce3c8f178ac147aec13e> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Ib8taMaQ2uZDTJ-X6MD6c7GIrGXqpc48jxeKwUeuwT4.pb
    Aug 09, 2021 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 09, 2021 12:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 09, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6533568053976662628.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-kmtn1I9zBhlZv_pIVXIRJg95RTYRI9suLTTrvIv9eLE.jar
    Aug 09, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 09, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 09, 2021 12:45:16 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 09, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 09, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 09, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 09, 2021 12:45:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 09, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-09_05_45_16-5461144568123952579?project=apache-beam-testing
    Aug 09, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-09_05_45_16-5461144568123952579
    Aug 09, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-09_05_45_16-5461144568123952579
    Aug 09, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-09T12:45:21.187Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 09, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:27.335Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 09, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:28.049Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 09, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:28.108Z: Expanding GroupByKey operations into optimizable parts.
    Aug 09, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:28.140Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 09, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:28.213Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 09, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:28.239Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 09, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:28.274Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 09, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:28.301Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 09, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:28.641Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:28.708Z: Starting 5 workers in us-central1-c...
    Aug 09, 2021 12:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:45:55.844Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 09, 2021 12:46:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:46:13.601Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 09, 2021 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:46:40.033Z: Workers have started successfully.
    Aug 09, 2021 12:46:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:46:40.067Z: Workers have started successfully.
    Aug 09, 2021 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:47:10.989Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2021 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:47:11.190Z: Cleaning up.
    Aug 09, 2021 12:47:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:47:11.333Z: Stopping worker pool...
    Aug 09, 2021 12:49:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:49:26.540Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 09, 2021 12:49:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T12:49:26.587Z: Worker pool stopped.
    Aug 09, 2021 12:49:34 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-09_05_45_16-5461144568123952579 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c102ebf5-5b68-49f6-9062-87a12d7517e5 and timestamp: 2021-08-09T12:49:34.321000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.178

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2021 12:49:34 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 37.118 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 13s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/oqglkx4sfwie4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2279

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2279/display/redirect>

Changes:


------------------------------------------
[...truncated 346.62 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 09, 2021 6:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
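
The BEAMPlan above shows both the projection (usedFields) and the filter being pushed into the source. At the IO level this corresponds to a BigQuery Storage API read with selected fields and a row restriction; a minimal sketch of the equivalent standalone read is below, where the table id is an illustrative placeholder, not the table configured in this test:

    import com.google.api.services.bigquery.model.TableRow;
    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    class DirectReadSketch {
      // DIRECT_READ with column projection and filter push-down, mirroring the
      // usedFields and the pushed-down filter logged above. The table id is an
      // assumption for illustration only.
      static TypedRead<TableRow> pushedDownRead() {
        return BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full")
            .withMethod(TypedRead.Method.DIRECT_READ)
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            .withRowRestriction(
                "(`type` = 'story' OR `type` = 'job') AND `score` > 2");
      }
    }
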
    Aug 09, 2021 6:44:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 09, 2021 6:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 09, 2021 6:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash 85435a5f0027011066996b84a36eeccd1152bec08505a5be0e12f9ae4dd80a49> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hUNaXwAnARBmmWuEo27szRFSvsCFBaW-DhL5rk3YCkk.pb
    Aug 09, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 09, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 09, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8103399717848889755.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-bFol2UsD4ciYmQFNUtogeSTIjb6awl1pWn2EMEkhhmY.jar
    Aug 09, 2021 6:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 09, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 09, 2021 6:45:03 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
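
The SEVERE entry above is gRPC's orphaned-channel detector: a BigQueryWriteClient opened during Pipeline.validate is garbage-collected without ever being closed. A minimal sketch of the shutdown discipline the warning text asks for, assuming a ManagedChannel you own directly (in this trace the channel belongs to the generated client, so closing that client is the practical fix):

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    class ChannelShutdownSketch {
      // shutdown()/shutdownNow() plus awaitTermination(), as the warning
      // prescribes: orderly shutdown first, forced shutdown if the wait expires.
      static void close(ManagedChannel channel) throws InterruptedException {
        channel.shutdown();
        if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
          channel.shutdownNow(); // cancels any in-flight RPCs
          channel.awaitTermination(5, TimeUnit.SECONDS);
        }
      }
    }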

    Aug 09, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 09, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 09, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 09, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 09, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-08_23_45_04-10933481903456741915?project=apache-beam-testing
    Aug 09, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-08_23_45_04-10933481903456741915
    Aug 09, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-08_23_45_04-10933481903456741915
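
The same cancellation can also be issued from the submitting JVM, since the runner returns a job handle; a minimal sketch, assuming the PipelineResult from Pipeline.run() is still in scope:

    import java.io.IOException;
    import org.apache.beam.sdk.PipelineResult;

    class CancelSketch {
      // Programmatic counterpart of the gcloud command above: the
      // PipelineResult returned by Pipeline.run() can cancel the running job.
      static void cancel(PipelineResult result) throws IOException {
        result.cancel();
      }
    }
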
    Aug 09, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-09T06:45:07.262Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 09, 2021 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:14.789Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 09, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:15.603Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 09, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:15.646Z: Expanding GroupByKey operations into optimizable parts.
    Aug 09, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:15.682Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 09, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:15.749Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 09, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:15.795Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 09, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:15.821Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 09, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:15.847Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 09, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:16.166Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2021 6:45:17 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:16.229Z: Starting 5 workers in us-central1-b...
    Aug 09, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:20.140Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 09, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:51.002Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 09, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:45:51.031Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 09, 2021 6:46:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:46:01.342Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 09, 2021 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:46:25.053Z: Workers have started successfully.
    Aug 09, 2021 6:46:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:46:25.082Z: Workers have started successfully.
    Aug 09, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:46:55.176Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:46:55.305Z: Cleaning up.
    Aug 09, 2021 6:46:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:46:55.380Z: Stopping worker pool...
    Aug 09, 2021 6:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:49:16.164Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 09, 2021 6:49:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T06:49:16.203Z: Worker pool stopped.
    Aug 09, 2021 6:49:22 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-08_23_45_04-10933481903456741915 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 33ae6e04-df33-4a79-bb82-73056fdf7acc and timestamp: 2021-08-09T06:49:22.548000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     7.864

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2021 6:49:23 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 35.313 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 3s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/bwwqvfo5bg2w4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2278

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2278/display/redirect>

Changes:


------------------------------------------
[...truncated 347.50 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
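
The readUsingDefaultMethod failure above is the recurring missing-coder error: BeamSqlRelUtils.toPCollection yields a PCollection of Beam Rows with no schema attached, so no RowCoder can be inferred. A minimal sketch of the remedy the exception itself suggests, assuming a PCollection<Row> named rows; the field names and types mirror the test query and are illustrative, not taken from the test source:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFixSketch {
      // Attach a schema so a RowCoder can be inferred, per the
      // "Please provide a schema instead using PCollection.setRowSchema"
      // hint in the stack trace above.
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING)
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        return rows.setRowSchema(schema); // or: rows.setCoder(RowCoder.of(schema))
      }
    }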

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 09, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 09, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 09, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 09, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 09, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 09, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 09, 2021 12:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 09, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash b1c9c802a3435f3be3888cc8f4e71e0a88f5fc07b5cd1a1ba3b3159f7c6fb92b> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-scnIAqNDXzvjiIzI9OceCoj1_Ae1zRobo7MVn3xvuSs.pb
    Aug 09, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 09, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 09, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9199385622573194346.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-c1PXxe0paR4y8j4hlKUHW7nY8Ve1p-0M-kbz9xfL00A.jar
    Aug 09, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 09, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 09, 2021 12:45:05 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 09, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 09, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 09, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 09, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 09, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-08_17_45_05-1782547972041225711?project=apache-beam-testing
    Aug 09, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-08_17_45_05-1782547972041225711
    Aug 09, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-08_17_45_05-1782547972041225711
    Aug 09, 2021 12:45:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-09T00:45:08.874Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 09, 2021 12:45:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:22.141Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 09, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:22.708Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 09, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:22.740Z: Expanding GroupByKey operations into optimizable parts.
    Aug 09, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:22.762Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 09, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:22.822Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 09, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:22.850Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 09, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:22.884Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 09, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:22.915Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 09, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:23.271Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2021 12:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:23.337Z: Starting 5 workers in us-central1-c...
    Aug 09, 2021 12:45:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:45:42.733Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 09, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:46:07.871Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 09, 2021 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:46:35.565Z: Workers have started successfully.
    Aug 09, 2021 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:46:35.601Z: Workers have started successfully.
    Aug 09, 2021 12:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:47:04.973Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 09, 2021 12:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:47:05.105Z: Cleaning up.
    Aug 09, 2021 12:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:47:05.186Z: Stopping worker pool...
    Aug 09, 2021 12:49:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:49:23.998Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 09, 2021 12:49:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-09T00:49:24.049Z: Worker pool stopped.
    Aug 09, 2021 12:49:29 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-08_17_45_05-1782547972041225711 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b6665455-4c8b-4594-a7d1-4e4587884b09 and timestamp: 2021-08-09T00:49:29.564000000Z:
                     Metric:                    Value:
                   read_time                     8.336
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 09, 2021 12:49:30 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 41.318 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/jgmi3m3rlekjk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2277

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2277/display/redirect>

Changes:


------------------------------------------
[...truncated 347.62 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 08, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 08, 2021 6:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 08, 2021 6:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 08, 2021 6:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash d371ce936f0f6bafb7331e73a978948373c7387cafeb2e8a5b8d2d1f899495b0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-03HOk28Pa6-3Mx5zqXiUg3PHOHyv6y6KW40tH4mUlbA.pb
    Aug 08, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 08, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 08, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9090659628900193495.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-WGYZQPgGuhskIfiGrzRUBXS-piY6q7fTh2pq8TO7bWg.jar
    Aug 08, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 08, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 08, 2021 6:45:03 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
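
The trace above is gRPC's record of where a leaked channel was allocated: BigQueryIO$TypedRead.validate opens a BigQuery Storage write client whose underlying ManagedChannel is later garbage-collected without being shut down, which is what the accompanying SEVERE warning (quoted in full with the identical traces in the builds below) complains about. The remediation it asks for is the standard gRPC shutdown sequence; a minimal sketch, with a helper name that is ours rather than Beam's:

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;

    // Orderly shutdown per the warning: call shutdown()/shutdownNow() and wait
    // until awaitTermination() returns true.
    static void closeChannel(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();                                    // stop accepting new calls
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {  // let in-flight RPCs drain
        channel.shutdownNow();                               // force-cancel stragglers
        channel.awaitTermination(5, TimeUnit.SECONDS);
      }
    }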

    Aug 08, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 08, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 08, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 08, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 08, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-08_11_45_04-10226450268225147910?project=apache-beam-testing
    Aug 08, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-08_11_45_04-10226450268225147910
    Aug 08, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-08_11_45_04-10226450268225147910
    Aug 08, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-08T18:45:07.864Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 08, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:15.560Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 08, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:16.206Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 08, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:16.238Z: Expanding GroupByKey operations into optimizable parts.
    Aug 08, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:16.257Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 08, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:16.333Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 08, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:16.370Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 08, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:16.393Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 08, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:16.425Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 08, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:16.744Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2021 6:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:16.820Z: Starting 5 workers in us-central1-b...
    Aug 08, 2021 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:45:42.687Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 08, 2021 6:46:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:46:06.177Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 08, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:46:32.671Z: Workers have started successfully.
    Aug 08, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:46:32.725Z: Workers have started successfully.
    Aug 08, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:47:05.423Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:47:05.561Z: Cleaning up.
    Aug 08, 2021 6:47:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:47:05.628Z: Stopping worker pool...
    Aug 08, 2021 6:49:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:49:19.671Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 08, 2021 6:49:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T18:49:19.703Z: Worker pool stopped.
    Aug 08, 2021 6:49:26 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-08_11_45_04-10226450268225147910 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): e253b48a-1ee4-4e08-a75a-c3509bd2432a and timestamp: 2021-08-08T18:49:26.311000000Z:
                     Metric:                    Value:
                   read_time                    11.023
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2021 6:49:26 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 38.645 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 7s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/dmyc6badi4vec

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2276

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2276/display/redirect>

Changes:


------------------------------------------
[...truncated 348.51 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2021 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2021 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2021 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2021 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2021 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2021 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2021 12:45:13 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 08, 2021 12:45:13 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
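
The plan and filter entries above show the push-down working end to end: the planner rewrites the scan into a BeamPushDownIOSourceRel that reads only the four used fields, and the filter is handed to the BigQuery Storage API instead of being evaluated in the pipeline. A minimal sketch of the equivalent hand-written read, assuming a placeholder table spec (the field list and row restriction are copied from the plan above):

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.values.PCollection;

    static PCollection<TableRow> readPushedDown(Pipeline pipeline) {
      return pipeline.apply(
          BigQueryIO.readTableRows()
              .from("apache-beam-testing:beam.HACKER_NEWS")  // placeholder table spec
              .withMethod(BigQueryIO.TypedRead.Method.DIRECT_READ)
              // project only the used fields, as in usedFields=[[by, type, title, score]]
              .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
              // the filter the planner pushed into the Storage API read
              .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
    }
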
    Aug 08, 2021 12:45:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 08, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 08, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 18ddd38fb4c2973fec3b038b3eec99f478d43fc8df39bf83f1d855dc08080c47> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-GN3Tj7TClz_sOwOLPuyZ9HjUP8jfOb-D8dhV3AgIDEc.pb
    Aug 08, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 08, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2183557962577290597.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-T69nbdpkDu9VQHF--Lm8i34Ucj9oo4Xg737Ci8lWI3U.jar
    Aug 08, 2021 12:45:20 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 08, 2021 12:45:21 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 08, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 08, 2021 12:45:21 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 08, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 08, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 08, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 08, 2021 12:45:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 08, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-08_05_45_21-5764693577363716533?project=apache-beam-testing
    Aug 08, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-08_05_45_21-5764693577363716533
    Aug 08, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-08_05_45_21-5764693577363716533
    Aug 08, 2021 12:45:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-08T12:45:27.434Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:45:33.892Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:45:34.545Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:45:34.581Z: Expanding GroupByKey operations into optimizable parts.
    Aug 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:45:34.612Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:45:34.703Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:45:34.757Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:45:34.788Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:45:34.827Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:45:35.233Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:45:35.329Z: Starting 5 workers in us-central1-c...
    Aug 08, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:46:03.530Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 08, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:46:09.597Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 08, 2021 12:46:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:46:09.633Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 08, 2021 12:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:46:19.908Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 08, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:46:43.109Z: Workers have started successfully.
    Aug 08, 2021 12:46:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:46:43.135Z: Workers have started successfully.
    Aug 08, 2021 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:47:13.638Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2021 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:47:13.801Z: Cleaning up.
    Aug 08, 2021 12:47:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:47:13.882Z: Stopping worker pool...
    Aug 08, 2021 12:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:49:40.790Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 08, 2021 12:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T12:49:40.834Z: Worker pool stopped.
    Aug 08, 2021 12:49:47 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-08_05_45_21-5764693577363716533 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f88dad61-d8d6-4908-a8e2-be68076dfb8c and timestamp: 2021-08-08T12:49:47.695000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.726

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2021 12:49:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 47.839 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 27s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/n7y4xudjxdt5o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2275

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2275/display/redirect>

Changes:


------------------------------------------
[...truncated 347.02 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
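
The exception message itself names both fixes for this failure: a PCollection of Beam Rows has no inferable coder, so the producing transform must either attach a schema (setRowSchema) or set a coder explicitly (setCoder). A minimal sketch under assumed names — the schema fields mirror the query's output columns, and RowMonitorFn stands in for the test's monitoring DoFn:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    Schema schema = Schema.builder()
        .addStringField("author")
        .addStringField("type")
        .addStringField("title")
        .addInt64Field("score")   // assumed numeric type; the query only needs score > 2
        .build();

    PCollection<Row> monitored =
        rows.apply(ParDo.of(new RowMonitorFn()))   // hypothetical DoFn<Row, Row>
            .setRowSchema(schema);                 // or: .setCoder(RowCoder.of(schema))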

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2021 6:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 08, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 08, 2021 6:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 08, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 08, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash c83e8d14db8dc3a19c9b75fff57b15f4ba9aeca101f80ec87e0e7d0f6cd30037> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-yD6NFNuNw6Gcm3X_9XsV9Lqa7KEB-A7Ifg59D2zTADc.pb
    Aug 08, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 08, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 08, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8024809353571676450.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-UUN25kQs6sjAKnTRNn49Yn-iph4zmp3PYyNq4q0PsYY.jar
    Aug 08, 2021 6:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 08, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 08, 2021 6:45:05 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 08, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 08, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 08, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 08, 2021 6:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 08, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-07_23_45_05-4773555866399165157?project=apache-beam-testing
    Aug 08, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-07_23_45_05-4773555866399165157
    Aug 08, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-07_23_45_05-4773555866399165157
    Aug 08, 2021 6:45:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-08T06:45:09.766Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:17.000Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:17.603Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:17.647Z: Expanding GroupByKey operations into optimizable parts.
    Aug 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:17.675Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:17.749Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:17.782Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:17.832Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:17.863Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:18.192Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:18.270Z: Starting 5 workers in us-central1-b...
    Aug 08, 2021 6:45:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:45:37.598Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 08, 2021 6:46:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:46:05.766Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 08, 2021 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:46:30.820Z: Workers have started successfully.
    Aug 08, 2021 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:46:30.864Z: Workers have started successfully.
    Aug 08, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:47:03.700Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:47:03.835Z: Cleaning up.
    Aug 08, 2021 6:47:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:47:03.908Z: Stopping worker pool...
    Aug 08, 2021 6:49:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:49:18.768Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 08, 2021 6:49:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T06:49:18.849Z: Worker pool stopped.
    Aug 08, 2021 6:49:23 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-07_23_45_05-4773555866399165157 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 9aea45a4-5394-42a4-a20d-00458c7bfc92 and timestamp: 2021-08-08T06:49:23.888000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.905

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2021 6:49:24 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 34.85 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 5s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/ytk2emtesj5es

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2274

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2274/display/redirect>

Changes:


------------------------------------------
[...truncated 345.69 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
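
The exception above names its own fixes: attach a schema with PCollection.setRowSchema so a coder can be inferred, or set a RowCoder explicitly. A minimal sketch of both, assuming a hypothetical PCollection<Row> named rows; the field names and types are illustrative, not the IT's actual table schema:

    import org.apache.beam.sdk.coders.RowCoder;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowCoderFix {
      // Declares the Row shape so the SDK can derive a coder for it.
      // Illustrative fields only -- match them to the real output rows.
      static PCollection<Row> fix(PCollection<Row> rows) {
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();
        // Preferred: attach the schema (a SchemaCoder is derived from it).
        return rows.setRowSchema(schema);
        // Equivalent for Row elements:
        // return rows.setCoder(RowCoder.of(schema));
      }
    }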

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 08, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 08, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 08, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 08, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 08, 2021 12:44:57 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
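
For context, the projection/filter shape being pushed down above can be reproduced with SqlTransform. This sketch runs the same query shape against an in-memory PCOLLECTION stand-in (with assumed field types) rather than the test's registered BigQuery table, where the planner pushes the predicate into the Storage API read as logged:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownShapeSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // Assumed schema for illustration; `by` is backtick-quoted in
        // SQL because BY is a reserved word.
        Schema schema = Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();
        PCollection<Row> items = p.apply(
            Create.of(
                    Row.withSchema(schema).addValues("alice", "story", "A story", 5L).build(),
                    Row.withSchema(schema).addValues("bob", "comment", "A comment", 1L).build())
                .withRowSchema(schema));
        // Same projection and filter as the IT's query; against a
        // DIRECT_READ BigQuery table this predicate is pushed down.
        PCollection<Row> result = items.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score FROM PCOLLECTION "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }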
    Aug 08, 2021 12:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 08, 2021 12:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 08, 2021 12:45:00 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115925 bytes, hash aef3b7c215fb78c903a8e7b2e05df13f7ab8747e0fa500d410c873165c5c35a6> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-rvO3whX7eMkDqOey4F3xP3q4dH4PpQDUEMhzFlxcNaY.pb
    Aug 08, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 08, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 08, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4342773786340832923.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-XBhzSJLTqaaISzTesld_a-2pynCdwmHzAgjeWonh4qo.jar
    Aug 08, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 08, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 08, 2021 12:45:03 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
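
The gRPC complaint above fires when a ManagedChannel is garbage-collected without an orderly shutdown. The pattern it asks for, sketched with an illustrative endpoint and timeout:

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel = ManagedChannelBuilder
            .forAddress("bigquerystorage.googleapis.com", 443)
            .useTransportSecurity()
            .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          // Orderly shutdown: stop accepting new calls, wait for
          // in-flight calls, then force-terminate on timeout.
          channel.shutdown();
          if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
            channel.shutdownNow();
          }
        }
      }
    }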

    Aug 08, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 08, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 08, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 08, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 08, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-07_17_45_03-15791955222139124365?project=apache-beam-testing
    Aug 08, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-07_17_45_03-15791955222139124365
    Aug 08, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-07_17_45_03-15791955222139124365
    Aug 08, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-08T00:45:07.046Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 08, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:12.239Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 08, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:12.745Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 08, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:12.776Z: Expanding GroupByKey operations into optimizable parts.
    Aug 08, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:12.807Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 08, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:12.876Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 08, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:12.897Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 08, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:12.927Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 08, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:12.959Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 08, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:13.288Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2021 12:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:13.369Z: Starting 5 workers in us-central1-a...
    Aug 08, 2021 12:45:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:40.749Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 08, 2021 12:45:55 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:45:55.674Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 08, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:46:20.450Z: Workers have started successfully.
    Aug 08, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:46:20.468Z: Workers have started successfully.
    Aug 08, 2021 12:46:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:46:49.020Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 08, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:46:49.147Z: Cleaning up.
    Aug 08, 2021 12:46:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:46:49.218Z: Stopping worker pool...
    Aug 08, 2021 12:49:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:49:10.503Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 08, 2021 12:49:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-08T00:49:10.554Z: Worker pool stopped.
    Aug 08, 2021 12:49:18 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-07_17_45_03-15791955222139124365 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 92572a61-5654-4e4f-b380-319c87718076 and timestamp: 2021-08-08T00:49:18.715000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.846

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 08, 2021 12:49:19 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4 mins 31.218 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 59s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/djg5ifxluffg4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2273

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2273/display/redirect>

Changes:


------------------------------------------
[...truncated 359.77 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2021 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2021 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2021 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2021 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2021 6:46:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2021 6:46:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2021 6:46:19 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 07, 2021 6:46:19 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 07, 2021 6:46:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 07, 2021 6:46:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 07, 2021 6:46:23 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash a71c604fcb9178a8bf2ba288e97af19069f924367dcf3b2eec49567d8839902a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-pxxgT8uReKi_K6KI6XrxkGn5JDZ9zzsu7ElWfYg5kCo.pb
    Aug 07, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 07, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6499599034364623817.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-MvYj50daebvSlWVG563LnXngClESP5jx333bADPdMjs.jar
    Aug 07, 2021 6:46:25 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 07, 2021 6:46:26 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 07, 2021 6:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 07, 2021 6:46:26 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 07, 2021 6:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 07, 2021 6:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 07, 2021 6:46:26 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 07, 2021 6:46:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 07, 2021 6:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-07_11_46_26-6628422150763392756?project=apache-beam-testing
    Aug 07, 2021 6:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-07_11_46_26-6628422150763392756
    Aug 07, 2021 6:46:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-07_11_46_26-6628422150763392756
    Aug 07, 2021 6:46:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-07T18:46:30.212Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 07, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:37.102Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 07, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:38.024Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 07, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:38.053Z: Expanding GroupByKey operations into optimizable parts.
    Aug 07, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:38.073Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 07, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:38.145Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 07, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:38.172Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 07, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:38.225Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 07, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:38.252Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 07, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:38.579Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:38.645Z: Starting 5 workers in us-central1-a...
    Aug 07, 2021 6:46:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:46:54.237Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 07, 2021 6:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:47:24.424Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 07, 2021 6:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:47:50.407Z: Workers have started successfully.
    Aug 07, 2021 6:47:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:47:50.444Z: Workers have started successfully.
    Aug 07, 2021 6:48:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:48:20.159Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2021 6:48:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:48:20.307Z: Cleaning up.
    Aug 07, 2021 6:48:21 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:48:20.380Z: Stopping worker pool...
    Aug 07, 2021 6:50:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:50:48.424Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 07, 2021 6:50:51 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T18:50:48.463Z: Worker pool stopped.
    Aug 07, 2021 6:50:53 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-07_11_46_26-6628422150763392756 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6cb1ce3f-17c1-4a8b-9741-7fe46d4deb39 and timestamp: 2021-08-07T18:50:53.619000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.652

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2021 6:50:54 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 44.535 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 35s
152 actionable tasks: 104 executed, 48 from cache

Publishing build scan...
https://gradle.com/s/ldiqjzgbkupky

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2272

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2272/display/redirect>

Changes:


------------------------------------------
[...truncated 353.80 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2021 12:45:23 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2021 12:45:24 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 07, 2021 12:45:24 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 07, 2021 12:45:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 07, 2021 12:45:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 07, 2021 12:45:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 669f691e13842925a6b253961002d8e971b7ab9b1bb78fc7f3d09e4a778e6b76> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Zp9pHhOEKSWmslOWEALY6XG3q5sbt4_H89CeSneOa3Y.pb
    Aug 07, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 07, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 07, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3726676162388734741.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-T_CNxhMQbw5QLrpCrRl4o3apJwhdLb9rN5MbhSdBek0.jar
    Aug 07, 2021 12:45:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 07, 2021 12:45:31 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 07, 2021 12:45:31 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
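
For reference, the orderly teardown this recurring gRPC warning asks for looks like the sketch below. It is illustrative only (the ChannelCleanup/close names and the 5-second timeout are not Beam code); `channel` stands for the io.grpc.ManagedChannel that was allocated and never closed:

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    final class ChannelCleanup {
      // Ask for a graceful shutdown first, then force-cancel anything still
      // in flight once the timeout elapses, as the warning above recommends.
      static void close(ManagedChannel channel) throws InterruptedException {
        channel.shutdown();
        if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
          channel.shutdownNow();
          channel.awaitTermination(5, TimeUnit.SECONDS);
        }
      }
    }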

    Aug 07, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 07, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 07, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 07, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 07, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-07_05_45_32-11412483887406653057?project=apache-beam-testing
    Aug 07, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-07_05_45_32-11412483887406653057
    Aug 07, 2021 12:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-07_05_45_32-11412483887406653057
    Aug 07, 2021 12:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-07T12:45:35.778Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 07, 2021 12:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:45:42.323Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 07, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:45:43.059Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 07, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:45:43.090Z: Expanding GroupByKey operations into optimizable parts.
    Aug 07, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:45:43.117Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 07, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:45:43.193Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 07, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:45:43.231Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 07, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:45:43.264Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 07, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:45:43.298Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 07, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:45:43.670Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2021 12:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:45:43.744Z: Starting 5 workers in us-central1-c...
    Aug 07, 2021 12:46:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:46:08.235Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
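
The cleanup this message suggests can be scripted against the Cloud Monitoring v3 Java client. A minimal sketch follows; the filter string and the hard-coded project id are assumptions, and the delete call is left commented out because it is destructive:

    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    public class CustomMetricCleanup {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          // Enumerate the custom.googleapis.com/* descriptors that count
          // against the quota mentioned in the warning above.
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of("apache-beam-testing").toString())
                  .setFilter("metric.type = starts_with(\"custom.googleapis.com/\")")
                  .build();
          client
              .listMetricDescriptors(request)
              .iterateAll()
              .forEach(
                  descriptor -> {
                    System.out.println(descriptor.getType());
                    // client.deleteMetricDescriptor(descriptor.getName()); // destructive
                  });
        }
      }
    }
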
    Aug 07, 2021 12:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:46:26.617Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 07, 2021 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:46:52.644Z: Workers have started successfully.
    Aug 07, 2021 12:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:46:52.685Z: Workers have started successfully.
    Aug 07, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:47:23.266Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:47:23.407Z: Cleaning up.
    Aug 07, 2021 12:47:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:47:23.482Z: Stopping worker pool...
    Aug 07, 2021 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:49:49.624Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 07, 2021 12:49:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T12:49:49.663Z: Worker pool stopped.
    Aug 07, 2021 12:49:55 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-07_05_45_32-11412483887406653057 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 825fdeea-f89a-478f-b9c9-eed4c4b4fcf0 and timestamp: 2021-08-07T12:49:55.541000000Z:
                     Metric:                    Value:
                   read_time                     8.811
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2021 12:49:56 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
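
This warning means the publisher was constructed without its InfluxDB settings, so the read_time and fields_read values above were printed but not persisted. The settings normally arrive as pipeline options; a sketch of the missing options is below (option names as used by Beam's InfluxDBSettings test utilities; the concrete values are placeholders, not taken from this job):

    --influxDatabase=beam_test_metrics
    --influxMeasurement=sql_bqio_read_java_batch
    --influxHost=http://localhost:8086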

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 42.298 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 36s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/7br2kr6te35aa

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2271

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2271/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15246 from [BEAM-12685] Allow managed thread count


------------------------------------------
[...truncated 358.73 KB...]
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2021 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2021 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2021 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2021 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2021 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2021 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2021 6:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 07, 2021 6:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
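
The SQL, the BEAMPlan, and the pushed-down filter above all come from the same query; applied through Beam's SqlTransform it would look roughly like the sketch below. This is a sketch only: it assumes the HACKER_NEWS table has already been registered with a BigQuery table provider using DIRECT_READ, as the test harness does:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuerySketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // With DIRECT_READ, the planner keeps only the used fields and pushes the
        // supported filter into the BigQuery storage read, as the plan above shows.
        PCollection<Row> filtered =
            pipeline.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        pipeline.run().waitUntilFinish();
      }
    }
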
    Aug 07, 2021 6:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 07, 2021 6:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 07, 2021 6:45:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115861 bytes, hash 0e9ad1809a212626a019cbf78b7bc365d2b3904b1de5b987cb4cc278254cb884> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-DprRgJohJiagGcv3i3vDZdKzkEsd5bmHy0zCeCVMuIQ.pb
    Aug 07, 2021 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 07, 2021 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 07, 2021 6:45:37 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3365102568998228474.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-paMvU7wN5qWtO_oy4fUGTd6E70HvcAIzmrotU5q8kiM.jar
    Aug 07, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
    Aug 07, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
    Aug 07, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
    Aug 07, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
    Aug 07, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 243 files cached, 5 files newly uploaded in 0 seconds
    Aug 07, 2021 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 07, 2021 6:45:38 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 07, 2021 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 07, 2021 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 07, 2021 6:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 07, 2021 6:45:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 07, 2021 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-06_23_45_39-4861045248339946861?project=apache-beam-testing
    Aug 07, 2021 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-06_23_45_39-4861045248339946861
    Aug 07, 2021 6:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-06_23_45_39-4861045248339946861
    Aug 07, 2021 6:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-07T06:45:42.301Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 07, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:45:50.485Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 07, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:45:51.145Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 07, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:45:51.177Z: Expanding GroupByKey operations into optimizable parts.
    Aug 07, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:45:51.200Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 07, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:45:51.307Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 07, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:45:51.374Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 07, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:45:51.409Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 07, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:45:51.433Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 07, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:45:51.815Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2021 6:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:45:51.897Z: Starting 5 workers in us-central1-c...
    Aug 07, 2021 6:46:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:46:09.275Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 07, 2021 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:46:32.176Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 07, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:46:58.778Z: Workers have started successfully.
    Aug 07, 2021 6:46:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:46:58.812Z: Workers have started successfully.
    Aug 07, 2021 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:47:30.827Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2021 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:47:30.961Z: Cleaning up.
    Aug 07, 2021 6:47:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:47:31.041Z: Stopping worker pool...
    Aug 07, 2021 6:49:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:49:50.597Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 07, 2021 6:49:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T06:49:50.645Z: Worker pool stopped.
    Aug 07, 2021 6:49:56 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-06_23_45_39-4861045248339946861 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b1b11a63-44f2-4315-9378-116557fcb438 and timestamp: 2021-08-07T06:49:56.885000000Z:
                     Metric:                    Value:
                   read_time                     9.357
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2021 6:49:57 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 41.136 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 39s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/7qmutqq5656t6

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2270

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2270/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15249: [BEAM-12690] Fix GroupIntoBatches watermark


------------------------------------------
[...truncated 347.62 KB...]
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 07, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 07, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 07, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 07, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 07, 2021 12:45:05 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 07, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 07, 2021 12:45:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 07, 2021 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 8d3efd14a137e750fa111e075a9b27f188167d4ef6e6da2d222086dcffb90883> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-jT79FKE351D6ER4HWpsn8YgWfU725totIiCG3P-5CIM.pb
    Aug 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7117191162157084175.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-xmyunhZDbbwZHralmsiw82B7UGhAsQVNs9t6F0Ne6v8.jar
    Aug 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.33.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.33.0-SNAPSHOT-qsyth_X-2AynopJtGlEp0y9ZYWCE9nAZ-Mad3LL61Fw.jar
    Aug 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.33.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.33.0-SNAPSHOT-4bF8QpaxkWlkdPssX5OV0S9rIzFuRU0pojrN2fBwjjk.jar
    Aug 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.33.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.33.0-SNAPSHOT-xz-aSlhizJi8jigGo3amZ2g_MkglE3dBulicsi2UX5U.jar
    Aug 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/udf/build/libs/beam-sdks-java-extensions-sql-udf-2.33.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-udf-2.33.0-SNAPSHOT-MlQQAS1Je1cH4lXtIOKM3zL6o97S5F7sJQ3zUailw08.jar
    Aug 07, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 243 files cached, 5 files newly uploaded in 0 seconds
    Aug 07, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 07, 2021 12:45:12 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 07, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 07, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 07, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 07, 2021 12:45:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 07, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-06_17_45_12-2171249495741172773?project=apache-beam-testing
    Aug 07, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-06_17_45_12-2171249495741172773
    Aug 07, 2021 12:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-06_17_45_12-2171249495741172773
    Aug 07, 2021 12:45:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-07T00:45:15.803Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:22.189Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:22.832Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:22.874Z: Expanding GroupByKey operations into optimizable parts.
    Aug 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:22.911Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:22.985Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:23.010Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:23.045Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:23.081Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:23.372Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2021 12:45:23 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:23.439Z: Starting 5 workers in us-central1-a...
    Aug 07, 2021 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:45:27.692Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 07, 2021 12:46:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:46:04.016Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 07, 2021 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:46:31.005Z: Workers have started successfully.
    Aug 07, 2021 12:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:46:31.031Z: Workers have started successfully.
    Aug 07, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:46:59.643Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 07, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:46:59.787Z: Cleaning up.
    Aug 07, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:46:59.849Z: Stopping worker pool...
    Aug 07, 2021 12:49:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:49:17.836Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 07, 2021 12:49:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-07T00:49:17.875Z: Worker pool stopped.
    Aug 07, 2021 12:49:25 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-06_17_45_12-2171249495741172773 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 181be229-d3a1-40f9-afa6-3c09bc04d166 and timestamp: 2021-08-07T00:49:25.137000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.823

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 07, 2021 12:49:25 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
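
The warning above means the run completed but its metrics were discarded: the publisher skips writing unless both an InfluxDB database and a measurement are configured. A hedged sketch of supplying them through Beam's test-utils settings builder (the builder API is recalled from the org.apache.beam.sdk.testutils.publishing module and should be treated as an assumption, as should all values):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    // Assumed API and example values; without database + measurement the
    // publisher logs the warning above and drops the results.
    InfluxDBSettings settings =
        InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();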

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 30.071 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/hb4ujcpcqqhlg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2269

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2269/display/redirect?page=changes>

Changes:

[noreply] [BEAM-10212] Integrate caching client (#15214)


------------------------------------------
[...truncated 354.02 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
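
This stack trace is the recurring cause of the readUsingDefaultMethod failures: the SQL output is a PCollection of Beam Rows, and no coder can be inferred for Row without an explicit schema. A minimal sketch of the remedy the exception message itself names, with an assumed schema for the four projected columns (field types are illustrative):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    static PCollection<Row> withRowSchema(PCollection<Row> rows) {
      // Assumed shape of the query output: author, type, title, score.
      Schema schema = Schema.builder()
          .addNullableField("author", Schema.FieldType.STRING)
          .addNullableField("type", Schema.FieldType.STRING)
          .addNullableField("title", Schema.FieldType.STRING)
          .addNullableField("score", Schema.FieldType.INT64)
          .build();
      // An explicit row schema lets the CoderRegistry infer RowCoder, which
      // avoids the "Unable to return a default Coder" IllegalStateException.
      return rows.setRowSchema(schema);
    }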

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2021 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2021 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2021 6:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2021 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2021 6:45:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2021 6:45:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2021 6:45:26 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 06, 2021 6:45:26 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
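
The two plans above show the push-down at work: the logical plan reads the whole table and filters afterwards, while the Beam plan replaces BeamIOSourceRel with a BeamPushDownIOSourceRel that requests only the four used fields and ships the supported filter with the BigQuery Storage read. A minimal sketch of issuing an equivalent query through SqlTransform (input is an assumed schema-aware PCollection<Row>, registered under the default PCOLLECTION name rather than the HACKER_NEWS table of the test):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    static PCollection<Row> storiesAndJobs(PCollection<Row> input) {
      // Same projection and filter as the test query; with a BigQuery table
      // provider in DIRECT_READ mode, both can be pushed into the read itself.
      return input.apply(
          SqlTransform.query(
              "SELECT `by` AS author, type, title, score "
                  + "FROM PCOLLECTION "
                  + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
    }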
    Aug 06, 2021 6:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 06, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 06, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 841cc3e8803bddd17d4b380b8bff916ffaacd94b376f782d69741b25a1f33f97> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-hBzD6IA73dF9SzgLi_-Rb_qs2Us3b3gtaXQbJaHzP5c.pb
    Aug 06, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 06, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 06, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2313379777724394158.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-UXfYID0v56-RHH4d__pyWoOGPTonxj79drd5U3zPijk.jar
    Aug 06, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 06, 2021 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 06, 2021 6:45:33 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
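
The SEVERE message above comes from gRPC's orphan-channel detector: a ManagedChannel created for the BigQuery write client during pipeline validation (visible in the trace) was garbage-collected without being shut down. The general pattern the message asks for, sketched independently of the Beam internals:

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;

    static void shutdownChannel(ManagedChannel channel) throws InterruptedException {
      channel.shutdown(); // stop accepting new calls
      if (!channel.awaitTermination(30, TimeUnit.SECONDS)) {
        channel.shutdownNow(); // force-cancel anything still in flight
        channel.awaitTermination(5, TimeUnit.SECONDS);
      }
    }

In the test itself the channel is owned by the BigQueryWriteClient that BigQueryServicesImpl creates while validating the read, so the leak sits in library code rather than the test body; the sketch only shows the shutdown discipline the warning refers to.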

    Aug 06, 2021 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 06, 2021 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 06, 2021 6:45:33 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 06, 2021 6:45:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 06, 2021 6:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-06_11_45_33-7543398404474817505?project=apache-beam-testing
    Aug 06, 2021 6:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-06_11_45_33-7543398404474817505
    Aug 06, 2021 6:45:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-06_11_45_33-7543398404474817505
    Aug 06, 2021 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-06T18:45:37.490Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 06, 2021 6:45:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:43.540Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 06, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:44.137Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 06, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:44.168Z: Expanding GroupByKey operations into optimizable parts.
    Aug 06, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:44.198Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 06, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:44.266Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 06, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:44.305Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 06, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:44.336Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 06, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:44.368Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 06, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:44.727Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2021 6:45:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:44.810Z: Starting 5 workers in us-central1-a...
    Aug 06, 2021 6:45:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:45:53.267Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 06, 2021 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:46:19.032Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 06, 2021 6:46:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:46:19.054Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 06, 2021 6:46:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:46:29.266Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 06, 2021 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:46:53.893Z: Workers have started successfully.
    Aug 06, 2021 6:46:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:46:53.927Z: Workers have started successfully.
    Aug 06, 2021 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:47:23.129Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2021 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:47:23.252Z: Cleaning up.
    Aug 06, 2021 6:47:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:47:23.324Z: Stopping worker pool...
    Aug 06, 2021 6:49:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:49:33.677Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 06, 2021 6:49:35 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T18:49:33.711Z: Worker pool stopped.
    Aug 06, 2021 6:49:40 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-06_11_45_33-7543398404474817505 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f5aa5bc4-fcf6-400c-b7fe-f95d6685203d and timestamp: 2021-08-06T18:49:40.834000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                      8.58

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2021 6:49:41 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 25.269 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 20s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/cvgsg2o7c7ijg

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2268

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2268/display/redirect?page=changes>

Changes:

[emilyye] bump FnAPI container

[Ismaël Mejía] [BEAM-12628] Add Avro reflect-based Coder option


------------------------------------------
[...truncated 357.38 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2021 12:53:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2021 12:53:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2021 12:53:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2021 12:53:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2021 12:53:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2021 12:53:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2021 12:53:25 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 06, 2021 12:53:25 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 06, 2021 12:53:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 06, 2021 12:53:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 06, 2021 12:53:29 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash b8de99a3279d721db99e23af30dfa3a5a33f1a7e5982cf8d691cc723bb830256> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uN6Zoyedch25niOvMN-jpaM_Gn5Zgs-NaRzHI7uDAlY.pb
    Aug 06, 2021 12:53:31 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 06, 2021 12:53:31 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 06, 2021 12:53:31 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1015731094087299925.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-YEZjQqWMVh0wpd3cOs5lmafk3qlXqzdjzzq1qzk1BvI.jar
    Aug 06, 2021 12:53:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.alibaba/fastjson/1.2.69/6cb063f1d527ff65bdbb9ea74888a5ffc3f92197/fastjson-1.2.69.jar to gs://temp-storage-for-perf-tests/loadtests/staging/fastjson-1.2.69-KniRdEoDeOAJmpuB3b7U0uGn6GMvb4b4_8-jg-UeScg.jar
    Aug 06, 2021 12:53:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 2 files newly uploaded in 0 seconds
    Aug 06, 2021 12:53:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 06, 2021 12:53:32 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 06, 2021 12:53:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 06, 2021 12:53:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 06, 2021 12:53:32 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 06, 2021 12:53:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 06, 2021 12:53:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-06_05_53_32-14751120400713260365?project=apache-beam-testing
    Aug 06, 2021 12:53:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-06_05_53_32-14751120400713260365
    Aug 06, 2021 12:53:33 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-06_05_53_32-14751120400713260365
    Aug 06, 2021 12:53:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-06T12:53:36.519Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 06, 2021 12:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:43.465Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 06, 2021 12:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:44.213Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 06, 2021 12:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:44.249Z: Expanding GroupByKey operations into optimizable parts.
    Aug 06, 2021 12:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:44.275Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 06, 2021 12:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:44.353Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 06, 2021 12:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:44.389Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 06, 2021 12:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:44.413Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 06, 2021 12:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:44.442Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 06, 2021 12:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:44.937Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2021 12:53:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:45.021Z: Starting 5 workers in us-central1-c...
    Aug 06, 2021 12:53:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:53:50.855Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 06, 2021 12:54:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:54:27.975Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 06, 2021 12:54:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:54:54.318Z: Workers have started successfully.
    Aug 06, 2021 12:54:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:54:54.348Z: Workers have started successfully.
    Aug 06, 2021 12:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:55:26.143Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2021 12:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:55:26.325Z: Cleaning up.
    Aug 06, 2021 12:55:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:55:26.408Z: Stopping worker pool...
    Aug 06, 2021 12:57:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:57:42.551Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 06, 2021 12:57:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T12:57:42.587Z: Worker pool stopped.
    Aug 06, 2021 12:57:48 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-06_05_53_32-14751120400713260365 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d21e9300-4986-4436-8cad-7d0478233b80 and timestamp: 2021-08-06T12:57:48.065000000Z:
                     Metric:                    Value:
                   read_time                     8.894
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2021 12:57:48 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 32.142 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 20s
152 actionable tasks: 104 executed, 48 from cache

Publishing build scan...
https://gradle.com/s/xa7bhksfe22hq

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2267

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2267/display/redirect>

Changes:


------------------------------------------
[...truncated 351.44 KB...]
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 06, 2021 6:45:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
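
    The BEAMPlan above shows both the projection (usedFields=[by, type, title, score]) and the supported filter being pushed into the BigQuery source. For context, a minimal sketch of issuing an equivalent query through Beam SQL's SqlTransform against a DIRECT_READ table; the DDL, column types, and LOCATION below are illustrative assumptions, not the test's actual setup:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class DirectReadPushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Illustrative DDL: TBLPROPERTIES '{"method": "DIRECT_READ"}' selects the
        // BigQuery Storage Read API, which is what makes project/filter push-down possible.
        String ddl =
            "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, `type` VARCHAR, `title` VARCHAR, `score` INTEGER) "
                + "TYPE 'bigquery' "
                + "LOCATION 'my-project:my_dataset.hacker_news' " // assumed table location
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'";

        PCollection<Row> rows =
            p.apply(
                SqlTransform.query(
                        "SELECT `by` AS author, `type`, `title`, `score` FROM HACKER_NEWS "
                            + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2")
                    .withDdlString(ddl));

        p.run().waitUntilFinish();
      }
    }

    The method entry in TBLPROPERTIES is what produces the "BigQuery method is set to: DIRECT_READ" lines above; with the Storage Read API selected, the planner can hand the used fields and the supported filter directly to the read.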
    Aug 06, 2021 6:45:18 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 06, 2021 6:45:22 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 06, 2021 6:45:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 78bff1b002885bacfb35b597b36f1e3a433f690a5b45a8e82c23e2a09ef92199> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-eL_xsAKIW6z7NbWXs28eOkM_aQpbRajoLCPioJ75IZk.pb
    Aug 06, 2021 6:45:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 06, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 06, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test620361598293345667.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-ampQU-OxJV9MZ7yOKCsNOuxYGnSi_Jz32vCL7ydFQ00.jar
    Aug 06, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 06, 2021 6:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 06, 2021 6:45:25 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
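
    The SEVERE message is gRPC's orphaned-channel detector: a ManagedChannel created inside the BigQuery write client (allocation site traced above) was garbage-collected without an explicit shutdown. It is noisy but does not itself fail the test. The cleanup the message asks for looks roughly like the following sketch; the target string mirrors the log, and the 5-second grace period is an arbitrary choice:

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public final class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
        try {
          // ... issue RPCs over the channel ...
        } finally {
          channel.shutdown(); // begin orderly shutdown; in-flight RPCs may complete
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-cancel whatever is still pending
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }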

    Aug 06, 2021 6:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 06, 2021 6:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 06, 2021 6:45:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 06, 2021 6:45:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 06, 2021 6:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-05_23_45_25-5214263957952220757?project=apache-beam-testing
    Aug 06, 2021 6:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-05_23_45_25-5214263957952220757
    Aug 06, 2021 6:45:26 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-05_23_45_25-5214263957952220757
    Aug 06, 2021 6:45:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-06T06:45:29.351Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:45:36.572Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:45:37.270Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:45:37.297Z: Expanding GroupByKey operations into optimizable parts.
    Aug 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:45:37.342Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:45:37.418Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:45:37.446Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:45:37.480Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:45:37.513Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:45:37.840Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2021 6:45:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:45:37.916Z: Starting 5 workers in us-central1-b...
    Aug 06, 2021 6:46:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:46:04.409Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 06, 2021 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:46:18.422Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 06, 2021 6:46:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:46:18.459Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 06, 2021 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:46:28.778Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 06, 2021 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:46:52.614Z: Workers have started successfully.
    Aug 06, 2021 6:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:46:52.648Z: Workers have started successfully.
    Aug 06, 2021 6:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:47:23.831Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2021 6:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:47:24.012Z: Cleaning up.
    Aug 06, 2021 6:47:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:47:24.114Z: Stopping worker pool...
    Aug 06, 2021 6:49:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:49:42.964Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 06, 2021 6:49:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T06:49:43.013Z: Worker pool stopped.
    Aug 06, 2021 6:49:49 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-05_23_45_25-5214263957952220757 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3124ea07-2d80-423f-838d-9ae0c181800e and timestamp: 2021-08-06T06:49:49.043000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.719

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2021 6:49:49 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 40.351 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 30s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
https://gradle.com/s/5zib3m5odpcyk

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2266

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2266/display/redirect?page=changes>

Changes:

[noreply] Revert "[BEAM-11934] Remove Dataflow override of streaming WriteFiles

[andyxu] Add google cloud heap profiling support to beam java sdk container


------------------------------------------
[...truncated 351.14 KB...]
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2021 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2021 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2021 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 06, 2021 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 06, 2021 12:45:30 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 06, 2021 12:45:30 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 06, 2021 12:45:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 06, 2021 12:45:31 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 06, 2021 12:45:31 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 06, 2021 12:45:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 06, 2021 12:45:36 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash e3c0d026600b0a3c0bf7872751b1907b4b2ab58c98bd8f31a29d84fb8d1925ef> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-48DQJmALCjwL94cnUbGQe0sqtYyYvY8xop2E-40ZJe8.pb
    Aug 06, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 06, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 06, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7485032491544716632.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-gGYRzj3qoxjuPjLpeA0b-9zDQmUIV8SVf3D_rOWWkMc.jar
    Aug 06, 2021 12:45:38 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 06, 2021 12:45:38 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 06, 2021 12:45:38 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 06, 2021 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 06, 2021 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 06, 2021 12:45:39 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 06, 2021 12:45:39 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 06, 2021 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-05_17_45_39-14615322108182942192?project=apache-beam-testing
    Aug 06, 2021 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-05_17_45_39-14615322108182942192
    Aug 06, 2021 12:45:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-05_17_45_39-14615322108182942192
    Aug 06, 2021 12:45:44 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-06T00:45:42.675Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 06, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:45:51.112Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 06, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:45:51.640Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 06, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:45:51.680Z: Expanding GroupByKey operations into optimizable parts.
    Aug 06, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:45:51.714Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 06, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:45:51.793Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 06, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:45:51.818Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 06, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:45:51.854Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 06, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:45:51.886Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 06, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:45:52.192Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2021 12:45:52 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:45:52.268Z: Starting 5 workers in us-central1-b...
    Aug 06, 2021 12:46:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:46:18.998Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 06, 2021 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:46:26.967Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 06, 2021 12:46:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:46:27.004Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 06, 2021 12:46:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:46:37.351Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 06, 2021 12:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:47:01.695Z: Workers have started successfully.
    Aug 06, 2021 12:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:47:01.725Z: Workers have started successfully.
    Aug 06, 2021 12:47:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:47:33.615Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 06, 2021 12:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:47:33.752Z: Cleaning up.
    Aug 06, 2021 12:47:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:47:33.840Z: Stopping worker pool...
    Aug 06, 2021 12:49:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:49:53.046Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 06, 2021 12:49:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-06T00:49:53.094Z: Worker pool stopped.
    Aug 06, 2021 12:49:59 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-05_17_45_39-14615322108182942192 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b8f4cee5-260d-46b5-9db1-de3bfcdfacfe and timestamp: 2021-08-06T00:49:59.465000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.709

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 06, 2021 12:49:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 39.022 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 39s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/kbmkpe3hrvfdy

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2265

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2265/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12601] Add append-only option (#15257)


------------------------------------------
[...truncated 346.35 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@792032091]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
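
    The IllegalStateException lists its own remedies: specify a coder explicitly with setCoder(), or, for a PCollection<Row>, attach a schema with setRowSchema. A minimal, self-contained sketch of the setRowSchema fix; PassThroughFn is a hypothetical stand-in for the test's RowMonitor transform, and the schema mirrors the author/type/title/score shape of the query above:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      static final Schema SCHEMA =
          Schema.builder()
              .addStringField("author")
              .addStringField("type")
              .addStringField("title")
              .addInt32Field("score")
              .build();

      // Hypothetical stand-in for the test's RowMonitor: passes rows through unchanged.
      static class PassThroughFn extends DoFn<Row, Row> {
        @ProcessElement
        public void process(@Element Row row, OutputReceiver<Row> out) {
          out.output(row);
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        Row row = Row.withSchema(SCHEMA).addValues("alice", "story", "Example", 3).build();
        p.apply(Create.of(row).withRowSchema(SCHEMA))
            .apply("RowMonitor", ParDo.of(new PassThroughFn()))
            // The remedy the error message asks for: without this call the output
            // PCollection<Row> has no coder and graph construction fails as above.
            .setRowSchema(SCHEMA);
        p.run().waitUntilFinish();
      }
    }

    Dropping the setRowSchema call reproduces exactly this failure at pipeline-construction time, when finishSpecifying tries to resolve the output coder.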

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2021 6:44:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2021 6:44:47 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 05, 2021 6:44:48 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 05, 2021 6:44:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 05, 2021 6:44:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 05, 2021 6:44:53 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 3da50e2b70ed87fd9cf63437335fe3f37b255aeb524d8f421028c637674fee94> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-PaUOK3Dth_2c9jQ3M1_j83slWutSTY9CECjGN2dP7pQ.pb
    Aug 05, 2021 6:44:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 05, 2021 6:44:56 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 05, 2021 6:44:56 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8852320340602886575.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-8Mbk8zxKP0INzzgnQICsw4JdJtlaLNrLNKm345UFCCA.jar
    Aug 05, 2021 6:44:56 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 05, 2021 6:44:56 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 05, 2021 6:44:56 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 05, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 05, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 05, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 05, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 05, 2021 6:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-05_11_44_57-5929571103769694781?project=apache-beam-testing
    Aug 05, 2021 6:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-05_11_44_57-5929571103769694781
    Aug 05, 2021 6:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-05_11_44_57-5929571103769694781
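
Besides the gcloud command printed above, a submitted job can also be cancelled programmatically through the handle returned by run(). A minimal sketch (the class and method names here are illustrative, not part of the test):

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    public class CancelJob {
      static void runAndCancel(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();
        // Equivalent to the gcloud command above: ask the runner to cancel
        // the job it just submitted.
        result.cancel();
      }
    }
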
    Aug 05, 2021 6:45:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-05T18:45:01.054Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:06.112Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:06.702Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:06.734Z: Expanding GroupByKey operations into optimizable parts.
    Aug 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:06.766Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:06.860Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:06.891Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:06.923Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:06.944Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 05, 2021 6:45:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:07.342Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2021 6:45:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:07.425Z: Starting 5 workers in us-central1-a...
    Aug 05, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:13.876Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 05, 2021 6:45:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:45:43.228Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 05, 2021 6:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:46:08.756Z: Workers have started successfully.
    Aug 05, 2021 6:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:46:08.775Z: Workers have started successfully.
    Aug 05, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:46:37.369Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:46:37.518Z: Cleaning up.
    Aug 05, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:46:37.597Z: Stopping worker pool...
    Aug 05, 2021 6:48:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:48:56.119Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 05, 2021 6:48:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T18:48:56.166Z: Worker pool stopped.
    Aug 05, 2021 6:49:03 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-05_11_44_57-5929571103769694781 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 83c591f3-c89d-4014-93ae-033e85ea5b49 and timestamp: 2021-08-05T18:49:03.838000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.524

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2021 6:49:04 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
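
The warning above means the run had no InfluxDB measurement/database configured, so the read_time and fields_read values were printed to the console but not persisted to the metrics store.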

Gradle Test Executor 5 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.011 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.005 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 27.123 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 44s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/2bewmdi5s65my

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2264

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2264/display/redirect?page=changes>

Changes:

[Etienne Chauchot] [BEAM-12591] Put Spark Structured Streaming runner sources back to main

[Etienne Chauchot] [BEAM-12629] As spark DataSourceV2 is only available for spark 2,

[Etienne Chauchot] [BEAM-12627] Deal with spark Encoders breaking change between spark 2 and

[Etienne Chauchot] [BEAM-12591] move SchemaHelpers to correct package

[Etienne Chauchot] [BEAM-8470] Disable wait for termination in a streaming pipeline because

[Etienne Chauchot] [BEAM-12630] Deal with breaking change in streaming pipelines start by

[Etienne Chauchot] [BEAM-12629] Make source tests spark version agnostic and move them back

[Etienne Chauchot] [BEAM-12629] Make a spark 3 source impl

[Etienne Chauchot] [BEAM-12591] Fix checkstyle and spotless

[Etienne Chauchot] [BEAM-12629] Reduce serializable to only needed classes and Fix schema

[Etienne Chauchot] [BEAM-12591] Add checkstyle exceptions for version specific classes

[Etienne Chauchot] [BEAM-12629] Fix sources javadocs and improve impl

[Etienne Chauchot] [BEAM-12591] Add spark 3 to structured streaming validates runner tests

[noreply] [BEAM-6516] Fixes race condition in RabbitMqIO causing duplicate acks


------------------------------------------
[...truncated 355.13 KB...]
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
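
The readUsingDefaultMethod failure ending above is the recurring coder error: the output of the RowMonitor ParDo is a PCollection of Beam Rows, and no default Coder can be inferred for Row. The fix the full message points at is attaching the schema explicitly with PCollection.setRowSchema. A minimal, self-contained sketch (class name and schema fields are illustrative, not taken from the test):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        final Schema schema =
            Schema.builder().addStringField("author").addInt32Field("score").build();

        PCollection<Row> rows =
            p.apply(Create.of(Row.withSchema(schema).addValues("alice", 3).build())
                .withRowSchema(schema));

        rows.apply(ParDo.of(new DoFn<Row, Row>() {
              @ProcessElement
              public void process(@Element Row row, OutputReceiver<Row> out) {
                out.output(row);
              }
            }))
            // A ParDo that emits Row cannot infer an output coder on its own;
            // attaching the schema installs a RowCoder and avoids the
            // IllegalStateException seen above.
            .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }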

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 05, 2021 12:45:18 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
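
The two plans above show the push-down working: the projection (usedFields) and the supported filter both move into the BigQuery storage read, so only matching rows and columns leave BigQuery. For orientation, a hedged sketch of how such a query is issued through Beam SQL (it assumes HACKER_NEWS has already been registered with the provider, e.g. via a CREATE EXTERNAL TABLE statement, which is omitted here):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuery {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        PCollection<Row> filtered =
            p.apply(
                SqlTransform.query(
                        "SELECT `by` AS author, type, title, score "
                            + "FROM HACKER_NEWS "
                            + "WHERE (type = 'story' OR type = 'job') AND score > 2")
                    // Registering the BigQuery provider lets the planner rewrite
                    // BeamIOSourceRel into BeamPushDownIOSourceRel, as in the
                    // BEAMPlan above.
                    .withTableProvider("bigquery", new BigQueryTableProvider()));
        p.run().waitUntilFinish();
      }
    }
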
    Aug 05, 2021 12:45:18 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 05, 2021 12:45:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 05, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115864 bytes, hash 1b7eda3a071ec23cdf1d54b5d9f0eb42a9b3380f91a746db8b37e91313106ea4> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-G37aOgcewjzfHVS12fDrQqmzOA-Rp0bbizfpExMQbqQ.pb
    Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test5213171920567842883.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6vm5Txh9PP0wqhHbEqdfmZc-K66HxA5wo2gNuc37Brs.jar
    Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-YjcXWNInX9ekye2Ilinimy8QNJZBoZtCQNEtODTeKIs.jar
    Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.33.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.33.0-SNAPSHOT-tests-z5Sh9dWo0DAeG5w5NaSrkooLfsJ26c7yIXDik9SzDm8.jar
    Aug 05, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/thrift/build/libs/beam-sdks-java-io-thrift-2.33.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-thrift-2.33.0-SNAPSHOT-tests-I-osQrG-lSEES6RD4ibuovIeVl-_U3nOEEsBLo1rYrA.jar
    Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 244 files cached, 4 files newly uploaded in 0 seconds
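
The staging lines above are the DataflowRunner preparing the job: the pipeline proto, the test jar, and the classpath are uploaded to the staging bucket before submission. A minimal sketch of options that produce this behavior (project, region, and bucket copied from the log; the rest is illustrative):

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class RunnerSetup {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("apache-beam-testing");
        options.setRegion("us-central1");
        options.setTempLocation("gs://temp-storage-for-perf-tests/loadtests/");
        // Files from filesToStage are uploaded here under content-hash names,
        // so unchanged files are served from cache ("244 files cached" above).
        options.setStagingLocation("gs://temp-storage-for-perf-tests/loadtests/staging/");
        Pipeline p = Pipeline.create(options);
      }
    }
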
    Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 05, 2021 12:45:25 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
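
The SEVERE entry above is gRPC's orphaned-channel detector: a ManagedChannel to bigquerystorage.googleapis.com was garbage-collected while still open. The hygiene it asks for, as a standalone sketch (target copied from the log; in this build the channel is created inside the SDK, so the real fix belongs in BigQueryServicesImpl rather than user code):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelHygiene {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443")
                .useTransportSecurity()
                .build();
        try {
          // ... issue RPCs on the channel ...
        } finally {
          // What the warning asks for: shut down explicitly and wait for
          // termination instead of letting the reference be collected.
          channel.shutdown();
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();
            channel.awaitTermination(5, TimeUnit.SECONDS);
          }
        }
      }
    }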

    Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 05, 2021 12:45:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 05, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-05_05_45_25-2918618145126024327?project=apache-beam-testing
    Aug 05, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-05_05_45_25-2918618145126024327
    Aug 05, 2021 12:45:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-05_05_45_25-2918618145126024327
    Aug 05, 2021 12:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-05T12:45:29.066Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:35.564Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:36.261Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:36.317Z: Expanding GroupByKey operations into optimizable parts.
    Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:36.348Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:36.410Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:36.448Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:36.483Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:36.510Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 05, 2021 12:45:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:36.885Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2021 12:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:36.976Z: Starting 5 workers in us-central1-c...
    Aug 05, 2021 12:45:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:45:57.614Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 05, 2021 12:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:46:16.368Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 05, 2021 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:46:43.359Z: Workers have started successfully.
    Aug 05, 2021 12:46:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:46:43.423Z: Workers have started successfully.
    Aug 05, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:47:12.912Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2021 12:47:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:47:13.077Z: Cleaning up.
    Aug 05, 2021 12:47:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:47:13.150Z: Stopping worker pool...
    Aug 05, 2021 12:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:49:39.839Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 05, 2021 12:49:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T12:49:39.904Z: Worker pool stopped.
    Aug 05, 2021 12:49:45 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-05_05_45_25-2918618145126024327 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): c7e99327-1572-43d4-86e7-71937e6b8141 and timestamp: 2021-08-05T12:49:45.486000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.862

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2021 12:49:45 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 4,5,main]) completed. Took 4 mins 36.652 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 25s
152 actionable tasks: 101 executed, 51 from cache

Publishing build scan...
https://gradle.com/s/i4y474hum3qv4

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2263

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2263/display/redirect>

Changes:


------------------------------------------
[...truncated 348.37 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 05, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 05, 2021 6:45:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 05, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 05, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 30d49427c8e17256b83a26c957dd76a739cb64be3991878e56ef10a262a0b779> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-MNSUJ8jhcla4OibJV912pznLZL45kYeOVu8QomKgt3k.pb
    Aug 05, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 05, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4244867247851315189.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-gd9_AZQRsJCADUyc2XeROa78gFRY7AXZKQIpn-jJg5w.jar
    Aug 05, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 05, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 05, 2021 6:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 05, 2021 6:45:06 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
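
Worth noting about the trace above: the allocation site is not test code. The channel is opened inside BigQueryIO$TypedRead.validate, via BigQueryServicesImpl.getDatasetService creating a BigQueryWriteClient, during Pipeline.validate. That suggests the leaked client is created, and never closed, by the SDK itself while the pipeline is being validated, before the job is even submitted.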

    Aug 05, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 05, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 05, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 05, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 05, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-04_23_45_07-1165296239272345412?project=apache-beam-testing
    Aug 05, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-04_23_45_07-1165296239272345412
    Aug 05, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-04_23_45_07-1165296239272345412
    Aug 05, 2021 6:45:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-05T06:45:10.797Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 05, 2021 6:45:18 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:17.934Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 05, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:18.610Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 05, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:18.644Z: Expanding GroupByKey operations into optimizable parts.
    Aug 05, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:18.682Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 05, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:18.770Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 05, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:18.803Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 05, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:18.828Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 05, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:18.849Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 05, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:19.214Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:19.296Z: Starting 5 workers in us-central1-c...
    Aug 05, 2021 6:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:45.769Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 05, 2021 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:57.446Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 05, 2021 6:45:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:45:57.476Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
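
The two autoscaling lines above read oddly together: with autoscalingAlgorithm=NONE the pool should go straight to the five requested workers, so the intermediate resize to 1 most plausibly reflects VM provisioning delay or, as the message itself says, a quota limit, rather than an autoscaling decision.
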
    Aug 05, 2021 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:46:07.698Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 05, 2021 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:46:32.495Z: Workers have started successfully.
    Aug 05, 2021 6:46:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:46:32.519Z: Workers have started successfully.
    Aug 05, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:47:02.262Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:47:02.402Z: Cleaning up.
    Aug 05, 2021 6:47:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:47:02.481Z: Stopping worker pool...
    Aug 05, 2021 6:49:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:49:20.402Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 05, 2021 6:49:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T06:49:20.450Z: Worker pool stopped.
    Aug 05, 2021 6:49:26 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-04_23_45_07-1165296239272345412 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 353e8d0d-9f73-4f1e-98a6-b5b557cb7461 and timestamp: 2021-08-05T06:49:26.853000000Z:
                     Metric:                    Value:
                   read_time                     8.633
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2021 6:49:27 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 36.233 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 6s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/3sfya4xvegcp2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2262

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2262/display/redirect?page=changes>

Changes:

[ajamato] [BEAM-12670] Relocate bq client exception imports to try block and


------------------------------------------
[...truncated 347.43 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
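
This coder failure is the recurring one in this job: the PCollection of Rows reaches pipeline construction without a schema attached, so no coder can be inferred. A minimal sketch of the remedy the message itself names, PCollection.setRowSchema, follows; the field names and types are illustrative, loosely following the query's author/type/title/score projection, and are not taken from the test source.

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Sketch only: attach a schema so Beam can infer a coder for the Rows,
      // which resolves the "Unable to return a default Coder" failure above.
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author") // illustrative fields, not the IT's
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")
                .build();
        return rows.setRowSchema(schema); // lets Beam use RowCoder.of(schema)
      }
    }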

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 05, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 05, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 05, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 05, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 05, 2021 12:44:58 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
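
The BEAMPlan and filter lines above show the projection (usedFields) and the supported predicate being folded into the BigQuery read. Outside this IT, which drives the planner directly through BeamSqlRelUtils.toPCollection, the usual public entry point for such a query is SqlTransform. A hedged sketch follows; it assumes the HACKER_NEWS table has already been registered with the BigQuery table provider (method DIRECT_READ), since that registration is what enables the push-down logged here.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class PushDownQuerySketch {
      public static void main(String[] args) {
        Pipeline pipeline =
            Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // Sketch only: with HACKER_NEWS registered as a BigQuery table, the
        // planner can push the projection and the supported parts of the
        // WHERE clause into the Storage API read, as the log shows.
        PCollection<Row> result =
            pipeline.apply(
                SqlTransform.query(
                    "SELECT `by` AS author, type, title, score "
                        + "FROM HACKER_NEWS "
                        + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
        pipeline.run().waitUntilFinish();
      }
    }
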
    Aug 05, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 05, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 05, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash c1fb893ae696c8fcfe0026c3a1b11b01c1ccaaa11c2a65af751f052a99104eb0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-wfuJOuaWyPz-ACbDobEbAcHMqqEcKmWvdR8FKpkQTrA.pb
    Aug 05, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 05, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 05, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2779920348811177472.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-I1Jli4aJmsqROKQ29NP8M1zek9hn64QiYJmJzfEUv4U.jar
    Aug 05, 2021 12:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 05, 2021 12:45:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 05, 2021 12:45:04 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
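
The SEVERE block above is gRPC's orphaned-channel detector, and its advice is literal: the BigQuery write client's channel was garbage-collected without an orderly shutdown. A hedged sketch of the shutdown discipline it asks for is below; the helper name and timeout are illustrative, while shutdown/shutdownNow/awaitTermination are the actual io.grpc.ManagedChannel API named in the message.

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    class ChannelShutdownSketch {
      // Sketch only: orderly shutdown first, then a forced shutdown if the
      // deadline passes, so the channel never reaches cleanQueue unreleased.
      static void closeQuietly(ManagedChannel channel) throws InterruptedException {
        channel.shutdown();
        if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
          channel.shutdownNow();
          channel.awaitTermination(5, TimeUnit.SECONDS);
        }
      }
    }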

    Aug 05, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 05, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 05, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 05, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 05, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-04_17_45_05-11458610852937432649?project=apache-beam-testing
    Aug 05, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-04_17_45_05-11458610852937432649
    Aug 05, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-04_17_45_05-11458610852937432649
    Aug 05, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-05T00:45:09.514Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 05, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:18.359Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 05, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:19.108Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 05, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:19.138Z: Expanding GroupByKey operations into optimizable parts.
    Aug 05, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:19.210Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 05, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:19.312Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 05, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:19.355Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 05, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:19.402Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 05, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:19.433Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 05, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:19.785Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2021 12:45:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:19.885Z: Starting 5 workers in us-central1-b...
    Aug 05, 2021 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:45:46.165Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 05, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:46:06.589Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 05, 2021 12:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:46:30.843Z: Workers have started successfully.
    Aug 05, 2021 12:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:46:30.883Z: Workers have started successfully.
    Aug 05, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:47:01.180Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 05, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:47:01.378Z: Cleaning up.
    Aug 05, 2021 12:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:47:01.472Z: Stopping worker pool...
    Aug 05, 2021 12:49:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:49:20.468Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 05, 2021 12:49:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-05T00:49:20.508Z: Worker pool stopped.
    Aug 05, 2021 12:49:26 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-04_17_45_05-11458610852937432649 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): bf733bf3-a73e-4d5f-8d08-132d41584b21 and timestamp: 2021-08-05T00:49:26.866000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.094

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 05, 2021 12:49:27 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 38.92 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 8s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/ccbyqzeqcbn6a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2261

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2261/display/redirect?page=changes>

Changes:

[dpires] [BEAM-12715] Use shard number specified by user in SnowflakeIO batch

[relax] fix GroupIntoBatches

[noreply] [BEAM-12703] Fix universal metrics. (#15260)

[noreply] [BEAM-12702] Pull step unique names from pipeline for metrics. (#15261)

[noreply] [BEAM-12678] Add dependency of java jars when running go VR on portable

[noreply] [BEAM-12671] Mark known composite transforms native (#15236)


------------------------------------------
[...truncated 350.50 KB...]
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 04, 2021 6:45:22 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 04, 2021 6:45:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 04, 2021 6:45:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 04, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash a6d6fd6eae98c297556c82f6672b55b4afe002fefc2450673978f8ee02c51daf> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-ptb9bq6YwpdVbIL2ZytVtK_gAv78JFBnOXj47gLFHa8.pb
    Aug 04, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4234040914649143496.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-LfWpLxE56EpXu1InwMnQ6lvZHARb97iAkQThOgCiHF8.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.cloud/google-cloud-bigtable-emulator/0.125.2/e2c4eccdc638e5883b658a222b99a318a817f3c6/google-cloud-bigtable-emulator-0.125.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/google-cloud-bigtable-emulator-0.125.2-FiUK-2Jw2KpBfAi4-J15Ft5rFwkLvGw0DsE7fz_A75M.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/testcontainers/1.15.1/91e6dfab8f141f77c6a0dd147a94bd186993a22c/testcontainers-1.15.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/testcontainers-1.15.1-_fMrLpRXSowMnAISRc73bMk5UqrXNnsv6vkPXOZRXu0.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-api/3.2.7/81408fc988c229ea11354fee9902c47842343f04/docker-java-api-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-api-3.2.7-AwtXDacySf3gUHoixVjjNqW0wUI0YcGJjMO-N7kxBho.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.testcontainers/kafka/1.15.1/ccd374ca7e5d8b06fb31ecffeb03abc42feada84/kafka-1.15.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/kafka-1.15.1-Z4qZ3jdWXDvdB7xD9VivkdVQiRZWNWUhPSIANVCSZDA.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.mongo/2.2.0/781d14f4e3d9eeb0b4c3e00a4ec165a04b3dc5c4/de.flapdoodle.embed.mongo-2.2.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.mongo-2.2.0-vNy3lJC0jW9u4Cy1AHsqSbjRUqOTX9ycpEmHkht7vvk.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/de.flapdoodle.embed/de.flapdoodle.embed.process/2.1.2/986b38302fa10018d5baccee7f655c31ee9afe5b/de.flapdoodle.embed.process-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/de.flapdoodle.embed.process-2.1.2-OasY7D5KRAimcZcWcjFwgi8Qb4B-iff1FfrVeNSih6A.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport-zerodep/3.2.7/d74cb05fabc57e9ec441dc39d0bc58ad649fad3d/docker-java-transport-zerodep-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-zerodep-3.2.7-O2a9rUb-899yDZMu4E8j9ItFOkNoggqbLP2xaLmOwJY.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.beam/beam-vendor-calcite-1_20_0/0.1/6d16a59dc771784789116607a04acd9a0ef91d16/beam-vendor-calcite-1_20_0-0.1.jar to gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-calcite-1_20_0-0.1-1NrX_9FNKiEqNk5qBOaRlj-IwqOvKvQIGIbTVgm_v8Y.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-csv/1.8/37ca9a9aa2d4be2599e55506a6d3170dd7a3df4/commons-csv-1.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-csv-1.8-qL1WZS7UZo2dWjOZSuUvWbnjnI6w68tmhOaK7udXmmE.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/janino/3.0.11/e699e368095379ba0402ea1780a87fcaea16e68f/janino-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/janino-3.0.11-kje3HSMpGA5ZIQ6aqhAO4xNFTvCuWIYIx1yxkxlZG-E.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.codehaus.janino/commons-compiler/3.0.11/f2a6ec7fbc929c9fc87ff8bb486c0574951c5b09/commons-compiler-3.0.11.jar to gs://temp-storage-for-perf-tests/loadtests/staging/commons-compiler-3.0.11-DxpPXyZccBoxkzJErnBF_O8YtPpZUEF-Je5wvlDd2s8.jar
    Aug 04, 2021 6:45:33 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/com.github.docker-java/docker-java-transport/3.2.7/315903a129f530422747efc163dd255f0fa2555e/docker-java-transport-3.2.7.jar to gs://temp-storage-for-perf-tests/loadtests/staging/docker-java-transport-3.2.7-ynNXlSXhDSyiLCXkA4c9E9_HGeKB7-wbpEzmswFrI-A.jar
    Aug 04, 2021 6:45:36 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 235 files cached, 13 files newly uploaded in 3 seconds
    Aug 04, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 04, 2021 6:45:36 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 04, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 04, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 04, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 04, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 04, 2021 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-04_11_45_36-10697877293879154604?project=apache-beam-testing
    Aug 04, 2021 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-04_11_45_36-10697877293879154604
    Aug 04, 2021 6:45:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-04_11_45_36-10697877293879154604
    Aug 04, 2021 6:45:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-04T18:45:40.291Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 04, 2021 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:47.318Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 04, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:48.284Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 04, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:48.367Z: Expanding GroupByKey operations into optimizable parts.
    Aug 04, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:48.419Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 04, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:48.498Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 04, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:48.534Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 04, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:48.559Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 04, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:48.587Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 04, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:48.992Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:49.065Z: Starting 5 workers in us-central1-a...
    Aug 04, 2021 6:45:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:45:55.206Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 04, 2021 6:46:29 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:46:28.688Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 04, 2021 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:46:53.088Z: Workers have started successfully.
    Aug 04, 2021 6:46:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:46:53.108Z: Workers have started successfully.
    Aug 04, 2021 6:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:47:25.288Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2021 6:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:47:25.433Z: Cleaning up.
    Aug 04, 2021 6:47:26 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:47:25.516Z: Stopping worker pool...
    Aug 04, 2021 6:49:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:49:44.919Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 04, 2021 6:49:45 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T18:49:44.996Z: Worker pool stopped.
    Aug 04, 2021 6:49:51 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-04_11_45_36-10697877293879154604 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f8fb9234-3a08-4cec-b6cf-e79210a7093f and timestamp: 2021-08-04T18:49:51.772000000Z:
                     Metric:                    Value:
                   read_time                    10.561
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2021 6:49:52 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 49.394 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 30s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/4yjcyrg2bsthc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2260

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2260/display/redirect>

Changes:


------------------------------------------
[...truncated 348.06 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1266814209]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 04, 2021 12:45:04 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 04, 2021 12:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 04, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 04, 2021 12:45:09 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115861 bytes, hash d45620cdd595ecbdc74929d3a37460fdcdba31096108021c1c04829ea2f6d090> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-1FYgzdWV7L3HSSnTo3Rg_c26MQlhCAIcHASCnqL20JA.pb
    Aug 04, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 04, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 04, 2021 12:45:11 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7977138305895242566.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-lTgS_7qFAf5LwcCOnHdqB9pGIyHF_b0tiUu7EdPcXbE.jar
    Aug 04, 2021 12:45:12 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 04, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 04, 2021 12:45:12 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
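
The SEVERE entry above is gRPC's orphaned-channel detector: a ManagedChannel was left for the garbage collector instead of being closed. The pattern the warning asks for looks like the following sketch; the target string is copied from the log, while the five-second timeout is an arbitrary choice:

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdown {
      public static void main(String[] args) throws InterruptedException {
        // Target copied from the log; any gRPC channel follows the same pattern.
        ManagedChannel channel =
            ManagedChannelBuilder.forTarget("bigquerystorage.googleapis.com:443").build();
        try {
          // ... issue RPCs over the channel ...
        } finally {
          // What the warning asks for: initiate shutdown, then wait for it.
          channel.shutdown();
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow(); // force-close anything still in flight
          }
        }
      }
    }

Per the allocation trace, this particular channel is created deep inside BigQueryServicesImpl while BigQueryIO$TypedRead.validate runs, so the missing close lives in Beam's own code rather than in the test.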

    Aug 04, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 04, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 04, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 04, 2021 12:45:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 04, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-04_05_45_12-10537848289109745345?project=apache-beam-testing
    Aug 04, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-04_05_45_12-10537848289109745345
    Aug 04, 2021 12:45:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-04_05_45_12-10537848289109745345
    Aug 04, 2021 12:45:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-04T12:45:16.147Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 04, 2021 12:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:22.087Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:22.808Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:22.861Z: Expanding GroupByKey operations into optimizable parts.
    Aug 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:22.889Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:22.967Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:22.996Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:23.030Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:23.064Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:23.406Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2021 12:45:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:23.473Z: Starting 5 workers in us-central1-a...
    Aug 04, 2021 12:45:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:45:50.440Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 04, 2021 12:46:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:46:02.266Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 04, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:46:27.892Z: Workers have started successfully.
    Aug 04, 2021 12:46:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:46:27.920Z: Workers have started successfully.
    Aug 04, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:46:55.464Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:46:55.629Z: Cleaning up.
    Aug 04, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:46:55.713Z: Stopping worker pool...
    Aug 04, 2021 12:49:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:49:12.786Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 04, 2021 12:49:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T12:49:12.834Z: Worker pool stopped.
    Aug 04, 2021 12:49:18 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-04_05_45_12-10537848289109745345 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 670cd844-3de7-4bf0-9ae7-03b4391d7895 and timestamp: 2021-08-04T12:49:18.380000000Z:
                     Metric:                    Value:
                   read_time                     8.739
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2021 12:49:18 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 24.873 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 58s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/gsaaai5htxpma

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2259

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2259/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #15272:[BEAM-12712] Exclude runners that can't handle


------------------------------------------
[...truncated 348.19 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
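
The exception's own remediation list points at PCollection.setRowSchema. A minimal sketch of that fix, assuming a hypothetical DoFn that emits Beam Rows (the class name, Create input, and field values are illustrative, not taken from the test; only the four field names mirror the query's projected columns):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      public static void main(String[] args) {
        // Schema for the four columns the failing query projects.
        final Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score")
                .build();

        Pipeline p = Pipeline.create();

        PCollection<Row> rows =
            p.apply(Create.of("story", "job"))
                .apply(ParDo.of(
                    new DoFn<String, Row>() {
                      @ProcessElement
                      public void process(@Element String type, OutputReceiver<Row> out) {
                        out.output(
                            Row.withSchema(schema)
                                .addValues("some-author", type, "some-title", 3)
                                .build());
                      }
                    }))
                // The call the exception asks for: without it, coder inference
                // fails with "Cannot provide a coder for a Beam Row."
                .setRowSchema(schema);

        p.run().waitUntilFinish();
      }
    }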

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2021 6:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 04, 2021 6:45:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 04, 2021 6:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 04, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 04, 2021 6:45:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115923 bytes, hash fa39a485a15b65201bd441140259320afc503eff2703a40af12936ef68d3fccc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--jmkhaFbZSAb1EEUAlkyCvxQPv8nA6QK8Sk272jT_Mw.pb
    Aug 04, 2021 6:45:06 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 04, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 04, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test4517734209561827740.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-3oc7gY_yRTQrDtImOWsRo214hyIQSE8vUy2v9PBaX_4.jar
    Aug 04, 2021 6:45:07 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 04, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 04, 2021 6:45:07 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 04, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 04, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 04, 2021 6:45:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 04, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 04, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-03_23_45_08-12985536090788945813?project=apache-beam-testing
    Aug 04, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-03_23_45_08-12985536090788945813
    Aug 04, 2021 6:45:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-03_23_45_08-12985536090788945813
    Aug 04, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-04T06:45:11.481Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 04, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:18.771Z: Worker configuration: e2-standard-2 in us-central1-b.
    Aug 04, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:19.336Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 04, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:19.374Z: Expanding GroupByKey operations into optimizable parts.
    Aug 04, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:19.401Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 04, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:19.458Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 04, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:19.485Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 04, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:19.505Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 04, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:19.540Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 04, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:19.868Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2021 6:45:20 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:19.974Z: Starting 5 workers in us-central1-b...
    Aug 04, 2021 6:45:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:45:28.496Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 04, 2021 6:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:46:05.872Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 04, 2021 6:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:46:30.668Z: Workers have started successfully.
    Aug 04, 2021 6:46:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:46:30.700Z: Workers have started successfully.
    Aug 04, 2021 6:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:47:01.097Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2021 6:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:47:01.237Z: Cleaning up.
    Aug 04, 2021 6:47:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:47:01.325Z: Stopping worker pool...
    Aug 04, 2021 6:49:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:49:21.746Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 04, 2021 6:49:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T06:49:21.775Z: Worker pool stopped.
    Aug 04, 2021 6:49:29 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-03_23_45_08-12985536090788945813 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): b448b93c-9ba3-4c7d-bd48-5aa3045c9551 and timestamp: 2021-08-04T06:49:29.302000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.221

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2021 6:49:29 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.026 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.03 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 4 mins 39.93 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 11s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/hgt63alj5edbm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2258

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2258/display/redirect?page=changes>

Changes:

[noreply] [BEAM-11088] Add TestStream package to Go SDK testing capabilities


------------------------------------------
[...truncated 347.10 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 04, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 04, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 04, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 04, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 04, 2021 12:44:59 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 04, 2021 12:45:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 04, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 04, 2021 12:45:03 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash 0a488c89668794a105a8b5f2fd5a1fba43641a314f30128e5296bb75c357e9d5> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-CkiMiWaHlKEFqLXy_VofukNkGjFPMBKOUpa7dcNX6dU.pb
    Aug 04, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 04, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1005247220021276552.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-g8IGneSo7SOpwt4VZUu2PjAokqEuYo27k48eWtPVRwQ.jar
    Aug 04, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 04, 2021 12:45:05 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 04, 2021 12:45:05 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 04, 2021 12:45:05 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 04, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 04, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 04, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 04, 2021 12:45:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 04, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-03_17_45_06-1756608502890285365?project=apache-beam-testing
    Aug 04, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-03_17_45_06-1756608502890285365
    Aug 04, 2021 12:45:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-03_17_45_06-1756608502890285365
    Aug 04, 2021 12:45:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-04T00:45:10.097Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 04, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:17.723Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 04, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:18.581Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 04, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:18.621Z: Expanding GroupByKey operations into optimizable parts.
    Aug 04, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:18.666Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 04, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:18.757Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 04, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:18.795Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 04, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:18.828Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 04, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:18.850Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 04, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:19.197Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2021 12:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:19.287Z: Starting 5 workers in us-central1-c...
    Aug 04, 2021 12:45:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:45:28.575Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
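
The quota warning above is benign for this run (the job still finishes DONE), but it means new custom user metrics stop being exported. One way to free quota is to delete old descriptors, sketched here with the Cloud Monitoring Java client; the google-cloud-monitoring dependency, project ID, and descriptor name are illustrative assumptions, not taken from this log:

    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.MetricDescriptorName;

    public class DeleteUnusedDescriptor {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          // Placeholder names: deleting one old/unused custom descriptor
          // frees quota for new Dataflow user metrics.
          client.deleteMetricDescriptor(MetricDescriptorName.of(
              "apache-beam-testing", "custom.googleapis.com/old_unused_counter"));
        }
      }
    }
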
    Aug 04, 2021 12:46:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:46:08.316Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 04, 2021 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:46:34.627Z: Workers have started successfully.
    Aug 04, 2021 12:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:46:34.667Z: Workers have started successfully.
    Aug 04, 2021 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:47:09.518Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 04, 2021 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:47:09.713Z: Cleaning up.
    Aug 04, 2021 12:47:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:47:09.799Z: Stopping worker pool...
    Aug 04, 2021 12:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:49:24.784Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 04, 2021 12:49:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-04T00:49:24.944Z: Worker pool stopped.
    Aug 04, 2021 12:49:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-03_17_45_06-1756608502890285365 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 3d2101d5-56ea-4355-b26d-b578547a30a1 and timestamp: 2021-08-04T00:49:32.791000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                    10.426

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 04, 2021 12:49:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
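
This warning recurs after every run in this digest: without measurement/database settings, the read_time and fields_read values above are only printed, never published. A minimal sketch of supplying them, assuming Beam's org.apache.beam.sdk.testutils.publishing.InfluxDBSettings builder (withHost/withDatabase/withMeasurement/get); the host, database, and measurement values are placeholders, not taken from this log:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxSettingsSketch {
      public static void main(String[] args) {
        // Placeholder connection values; the publisher skips publishing
        // when database or measurement is missing, as warned above.
        InfluxDBSettings settings = InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("beam_test_metrics")
            .withMeasurement("sql_bqio_read_java_batch")
            .get();
      }
    }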

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 7,5,main]) completed. Took 4 mins 43.873 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 14s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/7y3iczzormfiy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2257

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2257/display/redirect?page=changes>

Changes:

[sayat.satybaldiyev] Fix issue with update schema source format

[noreply] [BEAM-12696] Update NewTaggedExternal to not require inputs (#15266)


------------------------------------------
[...truncated 350.47 KB...]
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
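
The readUsingDefaultMethod failure above is the standard missing-schema error for a PCollection of Row: Beam cannot infer a coder for Row, so the collection needs an explicit schema (or coder) before the pipeline is finalized. A minimal sketch of the fix the message itself suggests, with an illustrative field list matching the query's projected columns (field types are assumptions):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      // Attach a schema so a Row coder can be inferred downstream.
      static PCollection<Row> withSchema(PCollection<Row> rows) {
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")   // type assumed; adjust to the table
            .build();
        return rows.setRowSchema(schema);
      }
    }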

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2021 6:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2021 6:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2021 6:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2021 6:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2021 6:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2021 6:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2021 6:45:34 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 03, 2021 6:45:34 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
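
For context, the BEAMPlan above shows both push-downs working: only the four used fields are read, and the supported predicate is sent to the BigQuery Storage server. A minimal sketch of running such a query the way the stack traces suggest the test does (BeamSqlEnv plus BeamSqlRelUtils.toPCollection); the DDL, table location, and column types are placeholders, not taken from the test source:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.impl.BeamSqlEnv;
    import org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils;
    import org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTableProvider;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class PushDownQuerySketch {
      public static void main(String[] args) throws Exception {
        Pipeline pipeline =
            Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        BeamSqlEnv sqlEnv = BeamSqlEnv.inMemory(new BigQueryTableProvider());
        // Placeholder DDL: the "method" property selects DIRECT_READ, which
        // is what enables the project/filter push-down logged above.
        sqlEnv.executeDdl(
            "CREATE EXTERNAL TABLE HACKER_NEWS (`by` VARCHAR, `type` VARCHAR, "
                + "title VARCHAR, score BIGINT) TYPE bigquery "
                + "LOCATION 'some-project:some_dataset.hacker_news' "
                + "TBLPROPERTIES '{\"method\": \"DIRECT_READ\"}'");
        PCollection<Row> rows = BeamSqlRelUtils.toPCollection(
            pipeline,
            sqlEnv.parseQuery(
                "SELECT `by` AS author, `type`, title, score FROM HACKER_NEWS "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND score > 2"));
        pipeline.run().waitUntilFinish();
      }
    }
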
    Aug 03, 2021 6:45:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 03, 2021 6:45:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 03, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 30d06f19a495f9204d714281bdaff4f45c869d6d2598266f36a914861a3ac159> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-MNBvGaSV-SBNcUKBva_09FyGnW0lmCZvNqkUhho6wVk.pb
    Aug 03, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 03, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 03, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar to gs://temp-storage-for-perf-tests/loadtests/staging/gradle-worker-hnF98-r6gKNOJilVQ_Tbv0exd1AswRoujBz1qsILD_E.jar
    Aug 03, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test496819071283751911.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-gr8ikIUTcT5zk-QSC3xVX3qdkKeWCUDdLNA9mBzXlJ8.jar
    Aug 03, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 246 files cached, 2 files newly uploaded in 0 seconds
    Aug 03, 2021 6:45:41 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 03, 2021 6:45:41 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
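
The SEVERE message at the head of this trace is gRPC's orphaned-channel check: a ManagedChannel created during pipeline validation (under BigQueryServicesImpl) was garbage-collected without being shut down. It is noisy rather than fatal here, since the job still completes. The hygiene the warning asks for, as a minimal standalone sketch (the target and timeout are placeholders):

    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;
    import java.util.concurrent.TimeUnit;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel = ManagedChannelBuilder
            .forTarget("bigquerystorage.googleapis.com:443")
            .useTransportSecurity()
            .build();
        try {
          // ... use the channel ...
        } finally {
          // Shut down and wait for termination, as the warning requests.
          channel.shutdown();
          if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {
            channel.shutdownNow();
          }
        }
      }
    }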

    Aug 03, 2021 6:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 03, 2021 6:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 03, 2021 6:45:42 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 03, 2021 6:45:42 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 03, 2021 6:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-03_11_45_42-10608621573875943835?project=apache-beam-testing
    Aug 03, 2021 6:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-03_11_45_42-10608621573875943835
    Aug 03, 2021 6:45:43 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-03_11_45_42-10608621573875943835
    Aug 03, 2021 6:45:47 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-03T18:45:45.914Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 03, 2021 6:45:53 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:45:53.468Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 03, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:45:54.084Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 03, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:45:54.125Z: Expanding GroupByKey operations into optimizable parts.
    Aug 03, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:45:54.143Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 03, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:45:54.218Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 03, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:45:54.256Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 03, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:45:54.288Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 03, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:45:54.331Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 03, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:45:54.695Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2021 6:45:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:45:54.758Z: Starting 5 workers in us-central1-c...
    Aug 03, 2021 6:46:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:46:21.886Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 03, 2021 6:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:46:34.556Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 03, 2021 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:47:01.160Z: Workers have started successfully.
    Aug 03, 2021 6:47:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:47:01.192Z: Workers have started successfully.
    Aug 03, 2021 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:47:32.288Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2021 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:47:32.439Z: Cleaning up.
    Aug 03, 2021 6:47:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:47:32.533Z: Stopping worker pool...
    Aug 03, 2021 6:49:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:49:57.150Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 03, 2021 6:49:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T18:49:57.202Z: Worker pool stopped.
    Aug 03, 2021 6:50:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-03_11_45_42-10608621573875943835 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 6a56bdf7-67ea-42f0-bf28-6bb3907985d6 and timestamp: 2021-08-03T18:50:02.419000000Z:
                     Metric:                    Value:
                   read_time                     8.924
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2021 6:50:02 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 3,5,main]) completed. Took 4 mins 39.868 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 39s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/ysxharennyxvi

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2256

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2256/display/redirect>

Changes:


------------------------------------------
[...truncated 347.99 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 03, 2021 12:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 03, 2021 12:45:02 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 03, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 03, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash d0ea907130341ebdc85f8b93913e37f49fb954488291fed6371caa13e2bf5e83> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-0OqQcTA0Hr3IX4uTkT439J-5VEiCkf7WNxyqE-K_XoM.pb
    Aug 03, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 03, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3221488191936847682.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test--WXBezzBRhKZ2xszee7hvBMpBOCqQ0_GcA_bJzy5mBQ.jar
    Aug 03, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 03, 2021 12:45:08 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 03, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 03, 2021 12:45:09 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 03, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 03, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 03, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 03, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 03, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-03_05_45_09-1276660137008568513?project=apache-beam-testing
    Aug 03, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-03_05_45_09-1276660137008568513
    Aug 03, 2021 12:45:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-03_05_45_09-1276660137008568513
    Aug 03, 2021 12:45:14 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-03T12:45:12.931Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 03, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:17.057Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 03, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:17.783Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 03, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:17.823Z: Expanding GroupByKey operations into optimizable parts.
    Aug 03, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:17.850Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 03, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:17.929Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 03, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:17.957Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 03, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:17.991Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 03, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:18.022Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 03, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:18.327Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2021 12:45:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:18.402Z: Starting 5 workers in us-central1-a...
    Aug 03, 2021 12:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:45:41.141Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 03, 2021 12:46:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:46:02.807Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 03, 2021 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:46:33.423Z: Workers have started successfully.
    Aug 03, 2021 12:46:34 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:46:33.453Z: Workers have started successfully.
    Aug 03, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:47:03.465Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:47:03.616Z: Cleaning up.
    Aug 03, 2021 12:47:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:47:03.696Z: Stopping worker pool...
    Aug 03, 2021 12:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:49:22.379Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 03, 2021 12:49:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T12:49:22.417Z: Worker pool stopped.
    Aug 03, 2021 12:49:29 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-03_05_45_09-1276660137008568513 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f00c4e42-02d9-4519-8383-cbf440b42b5f and timestamp: 2021-08-03T12:49:29.885000000Z:
                     Metric:                    Value:
                   read_time                     9.083
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2021 12:49:30 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 38.522 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 10s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/a274yzixeqpog

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2255

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2255/display/redirect>

Changes:


------------------------------------------
[...truncated 357.98 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
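
The failure above is self-describing: a PCollection of Beam Rows has no inferable Coder, so the transform producing it must attach a schema. A minimal Java sketch of the fix the message itself suggests; the names input and RowMonitorFn and the field list are illustrative assumptions, not the test's actual code:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Describe the fields each Row carries (illustrative field list).
    Schema schema = Schema.builder()
        .addStringField("author")
        .addStringField("type")
        .addStringField("title")
        .addInt32Field("score")
        .build();

    // setRowSchema attaches the schema so a RowCoder can be derived;
    // without it, PCollection.getCoder() throws the IllegalStateException
    // logged above.
    PCollection<Row> rows = input
        .apply(ParDo.of(new RowMonitorFn()))   // hypothetical DoFn
        .setRowSchema(schema);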

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2021 6:46:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2021 6:46:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2021 6:46:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2021 6:46:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2021 6:46:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2021 6:46:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2021 6:46:18 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 03, 2021 6:46:18 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 03, 2021 6:46:19 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 03, 2021 6:46:21 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 03, 2021 6:46:22 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115932 bytes, hash dc774f2f1545bf3ded0aa35de327e8188f577d9ba76c647644a7806e6e72308d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3HdPLxVFvz3tCqNd4yfoGI9XfZunbGR2RKeAbm5yMI0.pb
    Aug 03, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 03, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test8073755685613273230.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2sLzr4Tln4-VY55daPlydns75iI6DkAUIXjNNnrtZi0.jar
    Aug 03, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 03, 2021 6:46:24 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 03, 2021 6:46:24 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 03, 2021 6:46:24 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
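
The SEVERE block above is gRPC's orphaned-channel detector, not a crash: a BigQuery write client was garbage-collected while its ManagedChannel was still open, and the "exception" merely records where the channel was allocated. The remedy the warning itself prescribes, sketched as a generic Java helper (an assumption-level sketch, not Beam's actual cleanup code):

    import java.util.concurrent.TimeUnit;

    import io.grpc.ManagedChannel;

    // Gracefully close a channel, escalating to shutdownNow() if pending
    // RPCs do not drain in time, and only return once awaitTermination()
    // reports true -- exactly what the log message asks for.
    static void closeChannel(ManagedChannel channel) throws InterruptedException {
      channel.shutdown();
      if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
        channel.shutdownNow();
        channel.awaitTermination(5, TimeUnit.SECONDS);
      }
    }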

    Aug 03, 2021 6:46:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 03, 2021 6:46:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 03, 2021 6:46:25 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 03, 2021 6:46:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 03, 2021 6:46:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-02_23_46_25-8375722086399284082?project=apache-beam-testing
    Aug 03, 2021 6:46:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-02_23_46_25-8375722086399284082
    Aug 03, 2021 6:46:25 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-02_23_46_25-8375722086399284082
    Aug 03, 2021 6:46:29 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-03T06:46:28.503Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
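
This WARNING records a configuration mismatch rather than a failure: with autoscaling disabled, Dataflow sizes the pool from --numWorkers and ignores --maxNumWorkers. A hedged sketch of the two self-consistent option sets (standard Dataflow pipeline options; the job's actual flags are not visible in this log):

    # Fixed-size pool: --numWorkers sets the count, --maxNumWorkers is ignored.
    --autoscalingAlgorithm=NONE --numWorkers=5

    # Autoscaled pool: --maxNumWorkers caps THROUGHPUT_BASED scaling.
    --autoscalingAlgorithm=THROUGHPUT_BASED --maxNumWorkers=5
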
    Aug 03, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:34.167Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 03, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:34.712Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 03, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:34.753Z: Expanding GroupByKey operations into optimizable parts.
    Aug 03, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:34.779Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 03, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:34.846Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 03, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:34.878Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 03, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:34.917Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 03, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:34.946Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 03, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:35.291Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2021 6:46:35 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:35.363Z: Starting 5 workers in us-central1-a...
    Aug 03, 2021 6:46:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:46:49.479Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 03, 2021 6:47:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:47:15.451Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 03, 2021 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:47:40.349Z: Workers have started successfully.
    Aug 03, 2021 6:47:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:47:40.379Z: Workers have started successfully.
    Aug 03, 2021 6:48:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:48:09.514Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2021 6:48:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:48:09.659Z: Cleaning up.
    Aug 03, 2021 6:48:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:48:09.746Z: Stopping worker pool...
    Aug 03, 2021 6:50:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:50:27.008Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 03, 2021 6:50:27 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T06:50:27.049Z: Worker pool stopped.
    Aug 03, 2021 6:50:32 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-02_23_46_25-8375722086399284082 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): f04e84cd-a4a7-4df7-aecd-be6f4a5c4810 and timestamp: 2021-08-03T06:50:32.890000000Z:
                     Metric:                    Value:
                   read_time                     9.025
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2021 6:50:33 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 24.726 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 15s
152 actionable tasks: 104 executed, 48 from cache

Publishing build scan...
https://gradle.com/s/ht4bjmb5nzipc

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2254

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2254/display/redirect>

Changes:


------------------------------------------
[...truncated 364.25 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 03, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 03, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 03, 2021 12:47:34 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 03, 2021 12:47:35 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 03, 2021 12:47:35 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 03, 2021 12:47:35 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 03, 2021 12:47:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 03, 2021 12:47:40 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 2677b94ab9944ef00619e63e272a5b136da1964f866b7701785106028b3a8e33> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Jne5SrmUTvAGGeY-JypbE22hlk-Ga3cBeFEGAos6jjM.pb
    Aug 03, 2021 12:47:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 03, 2021 12:47:42 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 03, 2021 12:47:42 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test1968627374119434336.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-T2LXtFhPbh0PsB0jmgkdB7LGHpgQ-jTSqb2DVSkKI0M.jar
    Aug 03, 2021 12:47:42 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 03, 2021 12:47:42 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 03, 2021 12:47:42 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 03, 2021 12:47:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 03, 2021 12:47:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 03, 2021 12:47:43 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 03, 2021 12:47:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 03, 2021 12:47:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-02_17_47_43-5688273734682124745?project=apache-beam-testing
    Aug 03, 2021 12:47:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-02_17_47_43-5688273734682124745
    Aug 03, 2021 12:47:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-02_17_47_43-5688273734682124745
    Aug 03, 2021 12:47:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-03T00:47:46.793Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 03, 2021 12:47:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:47:54.022Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 03, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:47:54.891Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 03, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:47:54.921Z: Expanding GroupByKey operations into optimizable parts.
    Aug 03, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:47:54.948Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 03, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:47:55.002Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 03, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:47:55.037Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 03, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:47:55.057Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 03, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:47:55.091Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 03, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:47:55.392Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2021 12:47:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:47:55.465Z: Starting 5 workers in us-central1-c...
    Aug 03, 2021 12:48:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:48:09.626Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 03, 2021 12:48:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:48:36.215Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 03, 2021 12:49:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:49:01.980Z: Workers have started successfully.
    Aug 03, 2021 12:49:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:49:02.016Z: Workers have started successfully.
    Aug 03, 2021 12:49:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:49:34.102Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 03, 2021 12:49:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:49:34.307Z: Cleaning up.
    Aug 03, 2021 12:49:34 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:49:34.400Z: Stopping worker pool...
    Aug 03, 2021 12:51:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:51:51.930Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 03, 2021 12:51:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-03T00:51:51.980Z: Worker pool stopped.
    Aug 03, 2021 12:51:59 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-02_17_47_43-5688273734682124745 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1dc95eef-9e2f-4df9-bfdb-ff0302d86fdd and timestamp: 2021-08-03T00:51:59.153000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     8.801

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 03, 2021 12:51:59 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 6 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 8,5,main]) completed. Took 4 mins 33.864 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 37s
152 actionable tasks: 110 executed, 42 from cache

Publishing build scan...
https://gradle.com/s/skpgxxqpnmaoc

Stopped 5 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2253

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2253/display/redirect?page=changes>

Changes:

[relax] Ensure timer consistency in Dataflow and portable runners


------------------------------------------
[...truncated 349.90 KB...]
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1974477711]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2021 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2021 6:45:27 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2021 6:45:28 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2021 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2021 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2021 6:45:28 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2021 6:45:28 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 02, 2021 6:45:28 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 02, 2021 6:45:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 02, 2021 6:45:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 02, 2021 6:45:32 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash 9cd5b07b300f120a8af0f8ecd15c5a3fbc5f5ed99fd51efff173cc77cc4996af> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-nNWwezAPEgqK8Pjs0VxaP7xfXtmf1R7_8XPMd8xJlq8.pb
    Aug 02, 2021 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 02, 2021 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-axv3FHHZicpIQdrQ0TNeMX81FJ7x6KPAVlxA_bVHogU.jar
    Aug 02, 2021 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3731090438028020295.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-tuQOFR5OaUSHc1AsUQE0W8HkwCun14I3WXSI8gE_t-k.jar
    Aug 02, 2021 6:45:35 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 02, 2021 6:45:35 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 02, 2021 6:45:35 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
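
The trace above is the allocation site recorded by gRPC's orphaned-channel detector; the SEVERE warning that accompanies it (repeated verbatim in the builds below) originates inside the Beam BigQuery client during pipeline validation, not in the test code. For reference, the orderly shutdown the warning asks for looks like this minimal sketch; the class name and timeout values are illustrative, not Beam's actual cleanup code:

    import io.grpc.ManagedChannel;
    import java.util.concurrent.TimeUnit;

    final class Channels {
      // Orderly shutdown, as the warning's "call shutdown()/shutdownNow() and
      // wait until awaitTermination() returns true" instructs.
      static void close(ManagedChannel channel) throws InterruptedException {
        channel.shutdown();                               // begin orderly shutdown
        if (!channel.awaitTermination(10, TimeUnit.SECONDS)) {
          channel.shutdownNow();                          // force-cancel pending RPCs
          channel.awaitTermination(5, TimeUnit.SECONDS);  // wait for forced shutdown
        }
      }
    }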

    Aug 02, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 02, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 02, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 02, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 02, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-02_11_45_36-17162785159161696569?project=apache-beam-testing
    Aug 02, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-02_11_45_36-17162785159161696569
    Aug 02, 2021 6:45:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-02_11_45_36-17162785159161696569
    Aug 02, 2021 6:45:41 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-02T18:45:39.791Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 02, 2021 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:45:46.867Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 02, 2021 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:45:48.517Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 02, 2021 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:45:48.562Z: Expanding GroupByKey operations into optimizable parts.
    Aug 02, 2021 6:45:48 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:45:48.602Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 02, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:45:48.695Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 02, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:45:48.737Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 02, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:45:48.777Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 02, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:45:48.801Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 02, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:45:49.234Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2021 6:45:50 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:45:49.313Z: Starting 5 workers in us-central1-a...
    Aug 02, 2021 6:46:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:46:14.133Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 02, 2021 6:46:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:46:38.364Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 02, 2021 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:47:08.214Z: Workers have started successfully.
    Aug 02, 2021 6:47:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:47:08.264Z: Workers have started successfully.
    Aug 02, 2021 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:47:36.963Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2021 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:47:37.130Z: Cleaning up.
    Aug 02, 2021 6:47:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:47:37.238Z: Stopping worker pool...
    Aug 02, 2021 6:50:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:50:00.352Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 02, 2021 6:50:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T18:50:00.394Z: Worker pool stopped.
    Aug 02, 2021 6:50:07 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-02_11_45_36-17162785159161696569 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): d57b1298-335d-49a5-8195-a215601ed71b and timestamp: 2021-08-02T18:50:07.458000000Z:
                     Metric:                    Value:
                   read_time                     8.263
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2021 6:50:08 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
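
This warning means the run was started without InfluxDB settings, so the read_time and fields_read values above were printed but not stored. A hedged sketch of how such settings are built with Beam's test utilities; the builder method names are assumed from the testutils publishing package, and every value is a placeholder rather than this job's real configuration:

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    final class MetricsPublishing {
      // Assumption: these builder methods mirror Beam's testutils settings
      // object; host, database, and measurement below are placeholders.
      static InfluxDBSettings placeholderSettings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")
            .withDatabase("example_database")
            .withMeasurement("example_measurement")
            .get();
      }
    }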

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 4 mins 50.33 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 44s
152 actionable tasks: 98 executed, 54 from cache

Publishing build scan...
https://gradle.com/s/lueeybkdtdc5q

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2252

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2252/display/redirect>

Changes:


------------------------------------------
[...truncated 347.45 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1361861382]. Correct one of the following root causes:
      No Coder has been manually specified; you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
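
This readUsingDefaultMethod failure is identical in every build of this report: the SQL transform's output is a PCollection of Beam Rows with no schema attached, so no coder can be inferred. A minimal sketch of the remediation the message names, assuming the output is a PCollection<Row>; the four field names match the projected columns, but their types are assumptions rather than the HACKER_NEWS table definition:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    final class RowSchemaFix {
      static PCollection<Row> attachSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt64Field("score")   // assumed type for the score column
                .build();
        // setRowSchema attaches the schema so the default RowCoder can be
        // inferred, which is exactly what the IllegalStateException asks for.
        return rows.setRowSchema(schema);
      }
    }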

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 02, 2021 12:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
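
The plan and filter above show that both the column projection and the WHERE predicate are pushed down to BigQuery. Expressed through the public Beam SQL API, the query under test looks roughly like the sketch below; registration of the beam.HACKER_NEWS table (normally done via a BigQuery table provider) is assumed to happen elsewhere, and this is not the integration test's exact code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    final class PushDownQuery {
      static PCollection<Row> apply(Pipeline pipeline) {
        // The WHERE clause is what BigQueryTable reports as pushed down above.
        return pipeline.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM beam.HACKER_NEWS "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));
      }
    }
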
    Aug 02, 2021 12:45:00 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 02, 2021 12:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 02, 2021 12:45:03 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115862 bytes, hash 6e1d72260eeff6af1aad82feaab9f4890472164677239f6d51d453d9db158dcc> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-bh1yJg7v9q8arYL-qrn0iQRyFkZ3I59tUdRT2dsVjcw.pb
    Aug 02, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 02, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-8dj8jYMMHXszIcz-CHqEr7RZVD7HTRzv2xQ3RMYUFJg.jar
    Aug 02, 2021 12:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test6488459192298177254.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-2G3aYw8_hinB4PQwNgm4wPRnSpg-W_lu-JzVM_FaEiA.jar
    Aug 02, 2021 12:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 0 seconds
    Aug 02, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 02, 2021 12:45:06 PM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 02, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 02, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 02, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 02, 2021 12:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 02, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-02_05_45_07-16574221391885423918?project=apache-beam-testing
    Aug 02, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-02_05_45_07-16574221391885423918
    Aug 02, 2021 12:45:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-02_05_45_07-16574221391885423918
    Aug 02, 2021 12:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-02T12:45:11.854Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 02, 2021 12:45:16 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:16.626Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:17.219Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:17.260Z: Expanding GroupByKey operations into optimizable parts.
    Aug 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:17.288Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:17.355Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:17.385Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:17.418Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:17.453Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:17.755Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2021 12:45:19 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:17.823Z: Starting 5 workers in us-central1-a...
    Aug 02, 2021 12:45:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:45:37.768Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 02, 2021 12:46:03 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:46:02.507Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 02, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:46:26.677Z: Workers have started successfully.
    Aug 02, 2021 12:46:27 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:46:26.708Z: Workers have started successfully.
    Aug 02, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:46:55.692Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:46:55.834Z: Cleaning up.
    Aug 02, 2021 12:46:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:46:55.918Z: Stopping worker pool...
    Aug 02, 2021 12:49:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:49:09.661Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 02, 2021 12:49:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T12:49:09.723Z: Worker pool stopped.
    Aug 02, 2021 12:49:16 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-02_05_45_07-16574221391885423918 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 397813c5-c9fe-4acc-ac38-d93cc8392390 and timestamp: 2021-08-02T12:49:16.477000000Z:
                     Metric:                    Value:
                   read_time                     9.318
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2021 12:49:17 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.023 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) completed. Took 4 mins 26.128 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 55s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/p2z3lgoslfd4o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2251

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2251/display/redirect>

Changes:


------------------------------------------
[...truncated 347.37 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@993420861]. Correct one of the following root causes:
      No Coder has been manually specified; you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 02, 2021 6:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
    Aug 02, 2021 6:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 02, 2021 6:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 02, 2021 6:44:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash c8e6b91377f4a1dd508d4be318dc17a686d42a86163b15c6a5a666a412bec63c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-yOa5E3f0od1QjUvjGNwXpobUKoYWOxXGpaZmpBK-xjw.pb
    Aug 02, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 02, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-8dj8jYMMHXszIcz-CHqEr7RZVD7HTRzv2xQ3RMYUFJg.jar
    Aug 02, 2021 6:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test2223973170490881855.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-JAqUezHkp9ubYSa7b5majc3d8pRZ7Jf9g_q1zXP3ilE.jar
    Aug 02, 2021 6:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 file newly uploaded in 1 second
    Aug 02, 2021 6:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 02, 2021 6:45:02 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)

    Aug 02, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 02, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 02, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 02, 2021 6:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-01_23_45_03-8935888078845273155?project=apache-beam-testing
    Aug 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-01_23_45_03-8935888078845273155
    Aug 02, 2021 6:45:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-01_23_45_03-8935888078845273155
    Aug 02, 2021 6:45:09 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-02T06:45:06.778Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 02, 2021 6:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:12.646Z: Worker configuration: e2-standard-2 in us-central1-a.
    Aug 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:13.340Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:13.378Z: Expanding GroupByKey operations into optimizable parts.
    Aug 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:13.413Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:13.505Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:13.540Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:13.573Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:13.606Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:13.962Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2021 6:45:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:14.022Z: Starting 5 workers in us-central1-a...
    Aug 02, 2021 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:45:23.040Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 02, 2021 6:46:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:46:05.384Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 02, 2021 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:46:30.908Z: Workers have started successfully.
    Aug 02, 2021 6:46:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:46:30.934Z: Workers have started successfully.
    Aug 02, 2021 6:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:47:00.397Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2021 6:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:47:00.532Z: Cleaning up.
    Aug 02, 2021 6:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:47:00.611Z: Stopping worker pool...
    Aug 02, 2021 6:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:49:12.790Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 02, 2021 6:49:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T06:49:12.823Z: Worker pool stopped.
    Aug 02, 2021 6:49:19 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-01_23_45_03-8935888078845273155 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 194fff04-45e5-4312-9763-eb6ccda7e5c7 and timestamp: 2021-08-02T06:49:19.538000000Z:
                     Metric:                    Value:
                   read_time                     9.933
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2021 6:49:19 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.
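
This warning explains why the read_time/fields_read values above never reach a dashboard: the publisher was not configured with an InfluxDB measurement and database to write to. A hedged sketch of supplying them, assuming the builder methods on org.apache.beam.sdk.testutils.publishing.InfluxDBSettings (the method names should be verified against the testutils source; all values are placeholders):

    import org.apache.beam.sdk.testutils.publishing.InfluxDBSettings;

    public class InfluxConfigSketch {
      public static InfluxDBSettings settings() {
        return InfluxDBSettings.builder()
            .withHost("http://localhost:8086")      // placeholder endpoint
            .withDatabase("beam_test_metrics")      // placeholder database
            .withMeasurement("sql_bqio_read_java")  // placeholder measurement
            .get();
      }
    }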

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 4 mins 33.057 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/pkmsucxg4itk6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_SQLBigQueryIO_Batch_Performance_Test_Java #2250

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2250/display/redirect>

Changes:


------------------------------------------
[...truncated 348.06 KB...]
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
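
The IllegalStateException above is Beam failing to infer a coder for a Row-typed PCollection, and its own message names the remedy: attach a schema with PCollection.setRowSchema so a RowCoder can be derived. A minimal sketch of that fix -- the field names mirror the projected columns in the query above, but the field types are assumptions and this is not the IT's actual code:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaSketch {
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        Schema schema = Schema.builder()
            .addStringField("author")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")   // assumed INT64; check the table schema
            .build();
        // With a schema attached, the CoderRegistry can derive a RowCoder,
        // which is exactly what the failure above asks for.
        return rows.setRowSchema(schema);
      }
    }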

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2021 12:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2021 12:44:55 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Aug 02, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 02, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 02, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 02, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Aug 02, 2021 12:44:56 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
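
At the IO level, the plan above amounts to a BigQuery Storage API read with a field projection and a row restriction. A hedged equivalent written against BigQueryIO directly, for illustration only -- the table reference is a placeholder, and the restriction string is copied from the log line above:

    import java.util.Arrays;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    public class PushDownEquivalentSketch {
      static TypedRead<com.google.api.services.bigquery.model.TableRow> read() {
        return BigQueryIO.readTableRows()
            .from("some-project:some_dataset.HACKER_NEWS")  // placeholder table
            .withMethod(TypedRead.Method.DIRECT_READ)
            // Projection push-down: only the used fields are read.
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // Predicate push-down: the filter is evaluated by the Storage API.
            .withRowRestriction("(`type` = 'story' OR `type` = 'job') AND `score` > 2");
      }
    }
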
    Aug 02, 2021 12:44:56 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Aug 02, 2021 12:44:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 02, 2021 12:44:59 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <115924 bytes, hash 9666d33c5d19902b5d3e58e9ec37972a1e2ebedf0adf25b4b36b485df1cd998d> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-lmbTPF0ZkCtdPljp7DeXKh4uvt8K3yW0s2tIXfHNmY0.pb
    Aug 02, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 248 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Aug 02, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test96883393863423210.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-1MBBxRWpFxezXdwiO7e9smbkNJoDzaB39L3nNdPeabs.jar
    Aug 02, 2021 12:45:01 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.33.0-SNAPSHOT-8dj8jYMMHXszIcz-CHqEr7RZVD7HTRzv2xQ3RMYUFJg.jar
    Aug 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 247 files cached, 1 files newly uploaded in 0 seconds
    Aug 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 02, 2021 12:45:02 AM io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
    SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=13, target=bigquerystorage.googleapis.com:443} was not shutdown properly!!! ~*~*~*
        Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
    java.lang.RuntimeException: ManagedChannel allocation site
    	at io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:93)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:53)
    	at io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:44)
    	at io.grpc.internal.ManagedChannelImplBuilder.build(ManagedChannelImplBuilder.java:615)
    	at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:261)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createSingleChannel(InstantiatingGrpcChannelProvider.java:327)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.access$1700(InstantiatingGrpcChannelProvider.java:74)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider$1.createSingleChannel(InstantiatingGrpcChannelProvider.java:220)
    	at com.google.api.gax.grpc.ChannelPool.create(ChannelPool.java:72)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.createChannel(InstantiatingGrpcChannelProvider.java:227)
    	at com.google.api.gax.grpc.InstantiatingGrpcChannelProvider.getTransportChannel(InstantiatingGrpcChannelProvider.java:210)
    	at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:169)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.GrpcBigQueryWriteStub.create(GrpcBigQueryWriteStub.java:138)
    	at com.google.cloud.bigquery.storage.v1beta2.stub.BigQueryWriteStubSettings.createStub(BigQueryWriteStubSettings.java:145)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.<init>(BigQueryWriteClient.java:128)
    	at com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient.create(BigQueryWriteClient.java:109)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.newBigQueryWriteClient(BigQueryServicesImpl.java:1264)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.access$800(BigQueryServicesImpl.java:139)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:510)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl$DatasetServiceImpl.<init>(BigQueryServicesImpl.java:455)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryServicesImpl.getDatasetService(BigQueryServicesImpl.java:172)
    	at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$TypedRead.validate(BigQueryIO.java:973)
    	at org.apache.beam.sdk.Pipeline$ValidateVisitor.enterCompositeTransform(Pipeline.java:662)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:581)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
    	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
    	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
    	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:469)
    	at org.apache.beam.sdk.Pipeline.validate(Pipeline.java:598)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:322)
    	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:309)
    	at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethodPushDown(BigQueryIOPushDownIT.java:136)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    	at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
    	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
    	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
    	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
    	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
    	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
    	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
    	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
    	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
    	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
    	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
    	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
    	at java.lang.Thread.run(Thread.java:748)
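
The SEVERE entry above is gRPC's orphaned-channel detector firing: a channel opened during pipeline validation (inside BigQueryServicesImpl's DatasetService, per the trace) was garbage-collected without being shut down, so the fix belongs in that service's lifecycle rather than in the test. For reference, a minimal sketch of the shutdown pattern the warning asks for:

    import java.util.concurrent.TimeUnit;
    import io.grpc.ManagedChannel;
    import io.grpc.ManagedChannelBuilder;

    public class ChannelShutdownSketch {
      public static void main(String[] args) throws InterruptedException {
        ManagedChannel channel = ManagedChannelBuilder
            .forTarget("bigquerystorage.googleapis.com:443")
            .useTransportSecurity()
            .build();
        try {
          // ... issue RPCs through the channel ...
        } finally {
          // Orderly shutdown, bounded wait, then force-close if needed:
          // the shutdown()/awaitTermination() sequence the warning names.
          channel.shutdown();
          if (!channel.awaitTermination(5, TimeUnit.SECONDS)) {
            channel.shutdownNow();
          }
        }
      }
    }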

    Aug 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 02, 2021 12:45:02 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.33.0-SNAPSHOT
    Aug 02, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-08-01_17_45_03-5878873550893983361?project=apache-beam-testing
    Aug 02, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-08-01_17_45_03-5878873550893983361
    Aug 02, 2021 12:45:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-08-01_17_45_03-5878873550893983361
    Aug 02, 2021 12:45:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-08-02T00:45:06.277Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 02, 2021 12:45:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:13.212Z: Worker configuration: e2-standard-2 in us-central1-c.
    Aug 02, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:13.754Z: Expanding CoGroupByKey operations into optimizable parts.
    Aug 02, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:13.806Z: Expanding GroupByKey operations into optimizable parts.
    Aug 02, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:13.847Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Aug 02, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:13.924Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Aug 02, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:13.957Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Aug 02, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:13.990Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
    Aug 02, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:14.022Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
    Aug 02, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:14.404Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2021 12:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:14.490Z: Starting 5 workers in us-central1-c...
    Aug 02, 2021 12:45:32 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:31.693Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 02, 2021 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:46.228Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Aug 02, 2021 12:45:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:46.258Z: Resized worker pool to 1, though goal was 5.  This could be a quota issue.
    Aug 02, 2021 12:45:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:45:56.562Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Aug 02, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:46:21.133Z: Workers have started successfully.
    Aug 02, 2021 12:46:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:46:21.171Z: Workers have started successfully.
    Aug 02, 2021 12:46:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:46:50.574Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
    Aug 02, 2021 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:46:50.740Z: Cleaning up.
    Aug 02, 2021 12:46:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:46:50.822Z: Stopping worker pool...
    Aug 02, 2021 12:49:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:49:05.939Z: Autoscaling: Resized worker pool from 5 to 0.
    Aug 02, 2021 12:49:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-08-02T00:49:05.992Z: Worker pool stopped.
    Aug 02, 2021 12:49:11 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-08-01_17_45_03-5878873550893983361 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 30ecd1c8-62ea-43ab-8e39-844a33b15e84 and timestamp: 2021-08-02T00:49:11.552000000Z:
                     Metric:                    Value:
                 fields_read                 4375276.0
                   read_time                     9.147

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 02, 2021 12:49:11 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 2,5,main]) completed. Took 4 mins 25.167 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 54s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/obswr44nkq3v4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org