Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2022/06/04 15:37:37 UTC

[GitHub] [beam] damccorm opened a new issue, #20173: UNION ALL with double IllegalStateException

damccorm opened a new issue, #20173:
URL: https://github.com/apache/beam/issues/20173

   Three failures in shard 18 and three failures in shard 37:
   ```
   
   java.lang.IllegalStateException: the keyCoder of a GroupByKey must be deterministic
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:234)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:118)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:71)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
   	at org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.apply(KeyedPCollectionTuple.java:108)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:96)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:41)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
   	at cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl.executeQuery(ExecuteQueryServiceServer.java:288)
   	at com.google.zetasql.testing.SqlComplianceServiceGrpc$MethodHandlers.invoke(SqlComplianceServiceGrpc.java:423)
   	at com.google.zetasql.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:171)
   	at com.google.zetasql.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:283)
   	at com.google.zetasql.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:711)
   	at com.google.zetasql.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
   	at com.google.zetasql.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:748)
    
   ```
   
   1:
   ```
   
   Apr 01, 2020 3:55:55 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   INFO: Processing Sql statement: SELECT double_val FROM Table1 UNION ALL SELECT double_val FROM Table2
   Apr 01, 2020 3:55:55 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   SEVERE: !!!!!the keyCoder of a GroupByKey must be deterministic
   java.lang.IllegalStateException: the keyCoder of a GroupByKey must be deterministic
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:234)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:118)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:71)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
   	at org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.apply(KeyedPCollectionTuple.java:108)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:96)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:41)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
   	at cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl.executeQuery(ExecuteQueryServiceServer.java:288)
   	at com.google.zetasql.testing.SqlComplianceServiceGrpc$MethodHandlers.invoke(SqlComplianceServiceGrpc.java:423)
   	at com.google.zetasql.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:171)
   	at com.google.zetasql.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:283)
   	at com.google.zetasql.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:711)
   	at com.google.zetasql.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
   	at com.google.zetasql.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: SchemaCoder<Schema: Fields:
   Field{name=double_val, description=, type=FieldType{typeName=DOUBLE, nullable=true, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Options:{{}}  UUID: 9896a9ff-6b85-423e-bbd0-392f0461e87b delegateCoder: org.apache.beam.sdk.coders.Coder$ByteBuddy$lYxrgbBO@b517475 is not deterministic because:
   	All fields must have deterministic encoding
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:196)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:138)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:126)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:232)
   	... 25 more
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: NullableCoder(DoubleCoder) is not deterministic because:
   	Value coder must be deterministic
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:196)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:207)
   	at org.apache.beam.sdk.coders.NullableCoder.verifyDeterministic(NullableCoder.java:109)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:194)
   	... 28 more
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: DoubleCoder is not deterministic because:
   	Floating point encodings are not guaranteed to be deterministic.
   	at org.apache.beam.sdk.coders.DoubleCoder.verifyDeterministic(DoubleCoder.java:71)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:194)
   	... 31 more
   
   ```
   
   2:
   ```
   
   Apr 01, 2020 3:55:59 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   INFO: Processing Sql statement: SELECT bool_val, double_val, int64_val, str_val FROM Table1
   UNION ALL
   SELECT bool_val, double_val, int64_val, str_val FROM Table2
   Apr 01, 2020 3:55:59 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   SEVERE: !!!!!the keyCoder of a GroupByKey must be deterministic
   java.lang.IllegalStateException: the keyCoder of a GroupByKey must be deterministic
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:234)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:118)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:71)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
   	at org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.apply(KeyedPCollectionTuple.java:108)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:96)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:41)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
   	at cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl.executeQuery(ExecuteQueryServiceServer.java:288)
   	at com.google.zetasql.testing.SqlComplianceServiceGrpc$MethodHandlers.invoke(SqlComplianceServiceGrpc.java:423)
   	at com.google.zetasql.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:171)
   	at com.google.zetasql.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:283)
   	at com.google.zetasql.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:711)
   	at com.google.zetasql.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
   	at com.google.zetasql.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: SchemaCoder<Schema: Fields:
   Field{name=bool_val, description=, type=FieldType{typeName=BOOLEAN, nullable=true, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Field{name=double_val, description=, type=FieldType{typeName=DOUBLE, nullable=true, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Field{name=int64_val, description=, type=FieldType{typeName=INT64, nullable=true, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Field{name=str_val, description=, type=FieldType{typeName=STRING, nullable=true, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Options:{{}}  UUID: dc5cfc6f-96bd-4181-8f13-a9186fc8cdf9 delegateCoder: org.apache.beam.sdk.coders.Coder$ByteBuddy$tPVlD9JI@7c4b1367 is not deterministic because:
   	All fields must have deterministic encoding
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:196)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:138)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:126)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:232)
   	... 25 more
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: NullableCoder(DoubleCoder) is not deterministic because:
   	Value coder must be deterministic
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:196)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:207)
   	at org.apache.beam.sdk.coders.NullableCoder.verifyDeterministic(NullableCoder.java:109)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:194)
   	... 28 more
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: DoubleCoder is not deterministic because:
   	Floating point encodings are not guaranteed to be deterministic.
   	at org.apache.beam.sdk.coders.DoubleCoder.verifyDeterministic(DoubleCoder.java:71)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:194)
   	... 31 more
   
   ```
   
   
   3:
   ```
   
   INFO: Processing Sql statement: SELECT bool_val, double_val, int64_val, str_val FROM Table1
         UNION ALL SELECT bool_val, double_val, int64_val, str_val FROM Table2
         UNION ALL SELECT bool_val, double_val, int64_val, str_val FROM Table3
         UNION ALL SELECT bool_val, double_val, int64_val, str_val FROM Table2
         UNION ALL SELECT bool_val, double_val, int64_val, str_val FROM Table1
   Apr 01, 2020 3:56:02 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   SEVERE: !!!!!the keyCoder of a GroupByKey must be deterministic
   java.lang.IllegalStateException: the keyCoder of a GroupByKey must be deterministic
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:234)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:118)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:71)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
   	at org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.apply(KeyedPCollectionTuple.java:108)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:96)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:41)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
   	at cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl.executeQuery(ExecuteQueryServiceServer.java:288)
   	at com.google.zetasql.testing.SqlComplianceServiceGrpc$MethodHandlers.invoke(SqlComplianceServiceGrpc.java:423)
   	at com.google.zetasql.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:171)
   	at com.google.zetasql.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:283)
   	at com.google.zetasql.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:711)
   	at com.google.zetasql.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
   	at com.google.zetasql.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: SchemaCoder<Schema: Fields:
   Field{name=bool_val, description=, type=FieldType{typeName=BOOLEAN, nullable=true, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Field{name=double_val, description=, type=FieldType{typeName=DOUBLE, nullable=true, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Field{name=int64_val, description=, type=FieldType{typeName=INT64, nullable=true, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Field{name=str_val, description=, type=FieldType{typeName=STRING, nullable=true, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Options:{{}}  UUID: ab4f56bd-668d-404f-9c65-e14ac260abd9 delegateCoder: org.apache.beam.sdk.coders.Coder$ByteBuddy$JOGrmQb2@7f63a8f5 is not deterministic because:
   	All fields must have deterministic encoding
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:196)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:138)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:126)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:232)
   	... 58 more
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: NullableCoder(DoubleCoder) is not deterministic because:
   	Value coder must be deterministic
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:196)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:207)
   	at org.apache.beam.sdk.coders.NullableCoder.verifyDeterministic(NullableCoder.java:109)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:194)
   	... 61 more
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: DoubleCoder is not deterministic because:
   	Floating point encodings are not guaranteed to be deterministic.
   	at org.apache.beam.sdk.coders.DoubleCoder.verifyDeterministic(DoubleCoder.java:71)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:194)
   	... 64 more
   
   ```
   
   
   1:
   ```
   
   Apr 01, 2020 3:54:45 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   INFO: Processing Sql statement: SELECT COUNT(a) FROM (
   SELECT a FROM (SELECT 1.2 a UNION ALL SELECT 2.3 UNION ALL SELECT 3.4) LIMIT 1)
   Apr 01, 2020 3:54:46 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   SEVERE: !!!!!the keyCoder of a GroupByKey must be deterministic
   java.lang.IllegalStateException: the keyCoder of a GroupByKey must be deterministic
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:234)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:118)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:71)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
   	at org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.apply(KeyedPCollectionTuple.java:108)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:96)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:41)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
   	at cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl.executeQuery(ExecuteQueryServiceServer.java:288)
   	at com.google.zetasql.testing.SqlComplianceServiceGrpc$MethodHandlers.invoke(SqlComplianceServiceGrpc.java:423)
   	at com.google.zetasql.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:171)
   	at com.google.zetasql.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:283)
   	at com.google.zetasql.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:711)
   	at com.google.zetasql.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
   	at com.google.zetasql.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: SchemaCoder<Schema: Fields:
   Field{name=a, description=, type=FieldType{typeName=DOUBLE, nullable=false, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Options:{{}}  UUID: 0291cfcd-dc14-4502-9556-b3e6cf18f158 delegateCoder: org.apache.beam.sdk.coders.Coder$ByteBuddy$ra7xRTOg@2f0a2e40 is not deterministic because:
   	All fields must have deterministic encoding
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:196)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:138)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:126)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:232)
   	... 69 more
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: DoubleCoder is not deterministic because:
   	Floating point encodings are not guaranteed to be deterministic.
   	at org.apache.beam.sdk.coders.DoubleCoder.verifyDeterministic(DoubleCoder.java:71)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:194)
   	... 72 more
   
   ```
   
   2:
   ```
   
   Apr 01, 2020 3:54:46 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   INFO: Processing Sql statement: SELECT COUNT(a) FROM (
   SELECT a FROM (SELECT 1.2 a UNION ALL SELECT 2.3 UNION ALL SELECT 3.4) LIMIT 2)
   Apr 01, 2020 3:54:46 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   SEVERE: !!!!!the keyCoder of a GroupByKey must be deterministic
   java.lang.IllegalStateException: the keyCoder of a GroupByKey must be deterministic
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:234)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:118)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:71)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
   	at org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.apply(KeyedPCollectionTuple.java:108)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:96)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:41)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
   	at cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl.executeQuery(ExecuteQueryServiceServer.java:288)
   	at com.google.zetasql.testing.SqlComplianceServiceGrpc$MethodHandlers.invoke(SqlComplianceServiceGrpc.java:423)
   	at com.google.zetasql.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:171)
   	at com.google.zetasql.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:283)
   	at com.google.zetasql.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:711)
   	at com.google.zetasql.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
   	at com.google.zetasql.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: SchemaCoder<Schema: Fields:
   Field{name=a, description=, type=FieldType{typeName=DOUBLE, nullable=false, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Options:{{}}  UUID: 79708f22-a547-425d-a687-ce7930a6ca19 delegateCoder: org.apache.beam.sdk.coders.Coder$ByteBuddy$Xf02jM4h@61e2b4af is not deterministic because:
   	All fields must have deterministic encoding
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:196)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:138)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:126)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:232)
   	... 69 more
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: DoubleCoder is not deterministic because:
   	Floating point encodings are not guaranteed to be deterministic.
   	at org.apache.beam.sdk.coders.DoubleCoder.verifyDeterministic(DoubleCoder.java:71)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:194)
   	... 72 more
   
   ```
   
   3:
   ```
   
   Apr 01, 2020 3:54:54 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   INFO: Processing Sql statement: SELECT a FROM (SELECT 1.1 a UNION ALL SELECT 2.2 UNION ALL SELECT 3.3) LIMIT 3 OFFSET 4
   Apr 01, 2020 3:54:54 PM cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl executeQuery
   SEVERE: !!!!!the keyCoder of a GroupByKey must be deterministic
   java.lang.IllegalStateException: the keyCoder of a GroupByKey must be deterministic
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:234)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:110)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:368)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:118)
   	at org.apache.beam.sdk.transforms.join.CoGroupByKey.expand(CoGroupByKey.java:71)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:476)
   	at org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple.apply(KeyedPCollectionTuple.java:108)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:96)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSetOperatorRelBase.expand(BeamSetOperatorRelBase.java:41)
   	at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:542)
   	at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:50)
   	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
   	at java.util.Iterator.forEachRemaining(Iterator.java:116)
   	at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
   	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
   	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
   	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
   	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:51)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:67)
   	at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
   	at cloud.dataflow.sql.ExecuteQueryServiceServer$SqlComplianceServiceImpl.executeQuery(ExecuteQueryServiceServer.java:288)
   	at com.google.zetasql.testing.SqlComplianceServiceGrpc$MethodHandlers.invoke(SqlComplianceServiceGrpc.java:423)
   	at com.google.zetasql.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:171)
   	at com.google.zetasql.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:283)
   	at com.google.zetasql.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:711)
   	at com.google.zetasql.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
   	at com.google.zetasql.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: SchemaCoder<Schema: Fields:
   Field{name=a, description=, type=FieldType{typeName=DOUBLE, nullable=false, logicalType=null, collectionElementType=null, mapKeyType=null, mapValueType=null, rowSchema=null, metadata={}}, options={{}}}
   Options:{{}}  UUID: 225d4da7-2f1b-432f-b0bf-0807c53e9245 delegateCoder: org.apache.beam.sdk.coders.Coder$ByteBuddy$ooowjaL9@56579dd is not deterministic because:
   	All fields must have deterministic encoding
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:196)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:138)
   	at org.apache.beam.sdk.schemas.SchemaCoder.verifyDeterministic(SchemaCoder.java:126)
   	at org.apache.beam.sdk.transforms.GroupByKey.expand(GroupByKey.java:232)
   	... 47 more
   Caused by: org.apache.beam.sdk.coders.Coder$NonDeterministicException: DoubleCoder is not deterministic because:
   	Floating point encodings are not guaranteed to be deterministic.
   	at org.apache.beam.sdk.coders.DoubleCoder.verifyDeterministic(DoubleCoder.java:71)
   	at org.apache.beam.sdk.coders.Coder.verifyDeterministic(Coder.java:194)
   	... 50 more
   
   ```
   
   
   Imported from Jira [BEAM-9666](https://issues.apache.org/jira/browse/BEAM-9666). Original Jira may contain additional context.
   Reported by: apilloud.
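   
   The check that all of these traces hit can be reproduced directly against the coders named in the "Caused by" chain. The sketch below is illustrative only (it assumes the Beam Java SDK on the classpath; the class and variable names are not from the report): GroupByKey verifies that its key coder is deterministic, and a nullable DOUBLE schema field fails that check because DoubleCoder does not guarantee deterministic encodings. From the traces, BeamSetOperatorRelBase keys its CoGroupByKey on the whole row, so any floating point column in the select list ends up inside the key coder and trips this verification.
   ```
   import org.apache.beam.sdk.coders.Coder;
   import org.apache.beam.sdk.coders.DoubleCoder;
   import org.apache.beam.sdk.coders.NullableCoder;
   
   public class KeyCoderCheck {
     public static void main(String[] args) {
       // NullableCoder(DoubleCoder) is the per-field coder shown in the traces above
       // for a nullable DOUBLE schema field.
       Coder<Double> keyFieldCoder = NullableCoder.of(DoubleCoder.of());
       try {
         // GroupByKey.expand performs this same verification on its key coder.
         keyFieldCoder.verifyDeterministic();
       } catch (Coder.NonDeterministicException e) {
         // Prints the same reason chain seen in the logs:
         // "Value coder must be deterministic" ->
         // "Floating point encodings are not guaranteed to be deterministic."
         e.printStackTrace();
       }
     }
   }
   ```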


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [beam] jainam1995 commented on issue #20173: UNION ALL with double IllegalStateException

Posted by GitBox <gi...@apache.org>.
jainam1995 commented on issue #20173:
URL: https://github.com/apache/beam/issues/20173#issuecomment-1254259673

   Hey team, we are running into the same error while trying to perform a UNION ALL. Are there any alternatives to this? Any pointers on how to go ahead with this until there is a fix?
   Thanks
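   
   One possible interim direction, offered only as an unverified sketch rather than a confirmed fix from the Beam maintainers: UNION ALL has plain concatenation semantics, so the same result can be assembled with Flatten in the Java SDK, which merges PCollections without keying on the rows and therefore never reaches the GroupByKey determinism check. The pipeline below is illustrative; the table names and single-column schema mirror the first failing query in this issue, and in a real job the inputs would come from actual sources rather than Create.
   ```
   import org.apache.beam.sdk.Pipeline;
   import org.apache.beam.sdk.schemas.Schema;
   import org.apache.beam.sdk.transforms.Create;
   import org.apache.beam.sdk.transforms.Flatten;
   import org.apache.beam.sdk.values.PCollection;
   import org.apache.beam.sdk.values.PCollectionList;
   import org.apache.beam.sdk.values.Row;
   
   public class UnionAllViaFlatten {
     public static void main(String[] args) {
       Schema schema =
           Schema.builder().addNullableField("double_val", Schema.FieldType.DOUBLE).build();
       Pipeline p = Pipeline.create();
   
       PCollection<Row> table1 = p.apply("Table1",
           Create.of(Row.withSchema(schema).addValue(1.2).build()).withRowSchema(schema));
       PCollection<Row> table2 = p.apply("Table2",
           Create.of(Row.withSchema(schema).addValue(2.3).build()).withRowSchema(schema));
   
       // Equivalent of "SELECT double_val FROM Table1 UNION ALL SELECT double_val FROM Table2":
       // Flatten concatenates the inputs without a CoGroupByKey, so no deterministic
       // key coder is required.
       PCollection<Row> unionAll = PCollectionList.of(table1).and(table2)
           .apply("UnionAll", Flatten.pCollections());
   
       p.run().waitUntilFinish();
     }
   }
   ```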


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org