Posted to dev@drill.apache.org by "john schneider (JIRA)" <ji...@apache.org> on 2015/12/03 21:13:11 UTC

[jira] [Created] (DRILL-4158) Using CASE statement with heterogeneous schemas throws NullPointerException or returns unrelated error

john schneider created DRILL-4158:
-------------------------------------

             Summary: Using CASE statement with heterogeneous schemas throws NullPointerException or returns unrelated error
                 Key: DRILL-4158
                 URL: https://issues.apache.org/jira/browse/DRILL-4158
             Project: Apache Drill
          Issue Type: Bug
    Affects Versions: 1.3.0
         Environment: running locally on Mac OS X and remotely on CentOS 7
            Reporter: john schneider


I'm trying to use CASE statements to manage a heterogeneous stream of JSON objects, as
shown in the example from https://drill.apache.org/blog/2015/11/23/drill-1.3-released/.
While trying to get this to work I have encountered a number of failure modes.

For this report there are two files, casetest-1.json and casetest-2.json.

casetest-1.json has the following two lines; the two lines represent two different
schemas:
{"level":"EVENT","time":1448844983160,"user_info":{"session":"9OOLJ8HEGEQ0sTCVSXsK9ddJWVpFM5wM","user":"ndagdagan_apex@apixio.com"}}
{"level":"EVENT","time":1448844983160,"user_info":{"session":"9OOLJ8HEGEQ0sTCVSXsK9ddJWVpFM5wM","user":{"id":"ndagdagan_apex@apixio.com","roles":null,"isNotadmins":true,"iscoders":true}}}

casetest-2.json has the following two lines; just one schema is represented:
{"level":"EVENT","time":1448844983160,"user_info":{"session":"9OOLJ8HEGEQ0sTCVSXsK9ddJWVpFM5wM","user":{"id":"ndagdagan_apex@apixio.com","roles":null,"isNotadmins":true,"iscoders":true}}}
{"level":"EVENT","time":1448844983160,"user_info":{"session":"9OOLJ8HEGEQ0sTCVSXsK9ddJWVpFM5wM","user":{"id":"ndagdagan_apex@apixio.com","roles":null,"isNotadmins":true,"iscoders":true}}}


I'll outline all of the things I've tried and include stack traces where appropriate.

Prior to running the tests, I set `exec.enable_union_type` to true:
0: jdbc:drill:zk=local> ALTER SESSION SET `exec.enable_union_type` = true;
+-------+----------------------------------+
|  ok   |             summary              |
+-------+----------------------------------+
| true  | exec.enable_union_type updated.  |
+-------+----------------------------------+
1 row selected (1.344 seconds)
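
As a sanity check (an aside from me, not one of the failing tests), the setting can also be
confirmed by querying the sys.options system table; the exact columns shown vary by build,
so I just select everything:

-- confirm the union type option is set for this session
select * from sys.options where name = 'exec.enable_union_type';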


## FIRST TEST: two lines, two schemas - in one line the user field is a string, in the other it is a map
## first let's just select all records; I expect this to barf since there are two schemas
## WORKS AS EXPECTED for this version of the software (the failure below is the expected outcome)

0: jdbc:drill:zk=local> select * from dfs.`/Users/jos/work/drill/casetest-1.json` t ;
Error: DATA_READ ERROR: Error parsing JSON - You tried to start when you are using a ValueWriter of type NullableVarCharWriterImpl.

File  /Users/jos/work/drill/casetest-1.json
Record  2
Fragment 0:0

[Error Id: 1385aea5-68cb-4775-ae17-fad6b4901ea6 on 10.0.1.9:31010] (state=,code=0)

## SECOND TEST: now let's use a CASE statement to sort out the schemas. I don't
## expect this to barf, but barf it does, and hard, with a NullPointerException - THIS SHOULD HAVE WORKED
## Stack trace at bottom

0: jdbc:drill:zk=local> select case when is_map(t.user_info.`user`) then 'map' else 'string' end from dfs.`/Users/jos/work/drill/casetest-1.json` t ;
Error: SYSTEM ERROR: NullPointerException

Fragment 0:0

[Error Id: 2568df13-d11a-402d-9cc9-82bd0ddb3c50 on 10.19.220.63:31010] (state=,code=0)
0: jdbc:drill:zk=local>
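
As a diagnostic idea (I have not run this variant), it may help to confirm whether the
NullPointerException is specific to the union-type code path by retrying the same query with
the option turned off; my expectation, though unverified, is that it would then fall back to
the DATA_READ schema-change error from the first test instead of an NPE:

-- temporarily disable union types to see whether the NPE is tied to that code path
ALTER SESSION SET `exec.enable_union_type` = false;
select case when is_map(t.user_info.`user`) then 'map' else 'string' end from dfs.`/Users/jos/work/drill/casetest-1.json` t;
-- re-enable for the remaining tests
ALTER SESSION SET `exec.enable_union_type` = true;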


## THIRD TEST: now let's see if any CASE will work on any structure - for this we
## will use casetest-2.json, which has just one schema
## select * WORKS AS EXPECTED

0: jdbc:drill:zk=local> select * from dfs.`/Users/jos/work/drill/casetest-2.json` t ;
+-------+------+-----------+
| level | time | user_info |
+-------+------+-----------+
| EVENT | 1448844983160 | {"session":"9OOLJ8HEGEQ0sTCVSXsK9ddJWVpFM5wM","user":{"id":"ndagdagan_apex@apixio.com","isNotadmins":true,"iscoders":true}} |
| EVENT | 1448844983160 | {"session":"9OOLJ8HEGEQ0sTCVSXsK9ddJWVpFM5wM","user":{"id":"ndagdagan_apex@apixio.com","isNotadmins":true,"iscoders":true}} |
+-------+------+-----------+
2 rows selected (1.701 seconds)
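
As an extra data point before the CASE test (an aside; I am assuming the typeof() data-type
function is available in this build, which I have not confirmed for 1.3.0), projecting the
type of the column directly, without the CASE wrapper, is a simpler probe of the same nested
field:

-- probe the data type Drill assigns to the nested user field (typeof() availability assumed)
select typeof(t.user_info.`user`) as user_type from dfs.`/Users/jos/work/drill/casetest-2.json` t;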

## FOURTH TEST - now let's try to use a CASE statement with the same schema
## THIS SHOULD HAVE WORKED; it doesn't work, and we get different, more puzzling errors

0: jdbc:drill:zk=local> select case when is_map(t.user_info.`user`) then 'map' else 'string' end  from dfs.`/Users/jos/work/drill/casetest-2.json` t ;
Error: SYSTEM ERROR: SchemaChangeException: Failure while trying to materialize incoming schema.  Errors:

Error in expression at index -1.  Error: Missing function implementation: [is_map(MAP-REQUIRED)].  Full expression: --UNKNOWN EXPRESSION--.
Error in expression at index -1.  Error: Failure composing If Expression.  All conditions must return a boolean type.  Condition was of Type NULL..  Full expression: --UNKNOWN EXPRESSION--..

Fragment 0:0

[Error Id: c3a7f989-4d93-48c0-9a16-a38dd195314c on 10.19.220.63:31010] (state=,code=0)
0: jdbc:drill:zk=local>
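
Since the error says there is no is_map() implementation for a required MAP argument, a
possible workaround sketch is to branch on the type name instead of calling is_map(); this
again assumes typeof() exists in this build and that it returns the string 'MAP' for map
columns, neither of which I have verified here:

-- hypothetical workaround: compare the type name rather than calling is_map()
select case when typeof(t.user_info.`user`) = 'MAP' then 'map' else 'string' end
from dfs.`/Users/jos/work/drill/casetest-2.json` t;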


================ Stack trace from TEST 2 ===========================

Error: SYSTEM ERROR: NullPointerException

Fragment 0:0

[Error Id: 9daa2496-d774-47b6-b786-014aac9abe59 on 10.19.220.63:31010] (state=,code=0)


[Error Id: 9daa2496-d774-47b6-b786-014aac9abe59 on 10.19.220.63:31010]
	at org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:534) ~[drill-common-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.work.fragment.FragmentExecutor.sendFinalState(FragmentExecutor.java:321) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.work.fragment.FragmentExecutor.cleanup(FragmentExecutor.java:184) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:290) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38) [drill-common-1.3.0.jar:1.3.0]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_51]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_51]
	at java.lang.Thread.run(Thread.java:745) [na:1.8.0_51]
Caused by: java.lang.NullPointerException: null
	at org.apache.drill.exec.vector.complex.UnionVector.getFieldIdIfMatches(UnionVector.java:729) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.vector.complex.FieldIdUtil.getFieldIdIfMatches(FieldIdUtil.java:95) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.vector.complex.AbstractContainerVector.getFieldIdIfMatches(AbstractContainerVector.java:114) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.record.SimpleVectorWrapper.getFieldIdIfMatches(SimpleVectorWrapper.java:146) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.record.VectorContainer.getValueVectorId(VectorContainer.java:252) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.physical.impl.ScanBatch.getValueVectorId(ScanBatch.java:307) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.expr.ExpressionTreeMaterializer$MaterializeVisitor.visitSchemaPath(ExpressionTreeMaterializer.java:628) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.expr.ExpressionTreeMaterializer$MaterializeVisitor.visitSchemaPath(ExpressionTreeMaterializer.java:217) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.common.expression.SchemaPath.accept(SchemaPath.java:152) ~[drill-common-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.expr.ExpressionTreeMaterializer$MaterializeVisitor.visitFunctionCall(ExpressionTreeMaterializer.java:274) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.expr.ExpressionTreeMaterializer$MaterializeVisitor.visitFunctionCall(ExpressionTreeMaterializer.java:217) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.common.expression.FunctionCall.accept(FunctionCall.java:60) ~[drill-common-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.expr.ExpressionTreeMaterializer$MaterializeVisitor.visitIfExpression(ExpressionTreeMaterializer.java:494) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.expr.ExpressionTreeMaterializer$MaterializeVisitor.visitIfExpression(ExpressionTreeMaterializer.java:217) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.common.expression.IfExpression.accept(IfExpression.java:64) ~[drill-common-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.expr.ExpressionTreeMaterializer.materialize(ExpressionTreeMaterializer.java:120) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.setupNewSchema(ProjectRecordBatch.java:386) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.record.AbstractSingleRecordBatch.innerNext(AbstractSingleRecordBatch.java:78) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:131) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:156) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:104) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:80) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:94) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:256) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:250) ~[drill-java-exec-1.3.0.jar:1.3.0]
	at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_51]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[na:1.8.0_51]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) ~[hadoop-common-2.7.1.jar:na]
	at org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:250) [drill-java-exec-1.3.0.jar:1.3.0]
	... 4 common frames omitted
2015-12-01 10:36:15,231 [CONTROL-rpc-event-queue] WARN  o.a.drill.exec.work.foreman.Foreman - Dropping request to move to COMPLETED state as query is already at FAILED state (which is terminal).
2015-12-01 10:36:15,232 [CONTROL-rpc-event-queue] WARN  o.a.d.e.w.b.ControlMessageHandler - Dropping request to cancel fragment. 29a2175f-d3f2-caf9-2b51-12754264abe9:0:0 does not exist.
2015-12-01 10:36:15,234 [USER-rpc-event-queue] INFO  o.a.d.j.i.DrillResultSetImpl$ResultsListener - [#7] Query failed: 
org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR: NullPointerException

Fragment 0:0

[Error Id: 9daa2496-d774-47b6-b786-014aac9abe59 on 10.19.220.63:31010]
	at org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:118) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.rpc.user.UserClient.handleReponse(UserClient.java:112) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:47) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.rpc.BasicClientWithConnection.handle(BasicClientWithConnection.java:32) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.rpc.RpcBus.handle(RpcBus.java:69) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.rpc.RpcBus$RequestEvent.run(RpcBus.java:400) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.common.SerializedExecutor$RunnableProcessor.run(SerializedExecutor.java:105) [drill-common-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.rpc.RpcBus$SameExecutor.execute(RpcBus.java:264) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.common.SerializedExecutor.execute(SerializedExecutor.java:142) [drill-common-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:298) [drill-java-exec-1.3.0.jar:1.3.0]
	at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:269) [drill-java-exec-1.3.0.jar:1.3.0]
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254) [netty-handler-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242) [netty-codec-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) [netty-transport-4.0.27.Final.jar:4.0.27.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) [netty-common-4.0.27.Final.jar:4.0.27.Final]
	at java.lang.Thread.run(Thread.java:745) [na:1.8.0_51]



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)