Posted to dev@drill.apache.org by Charles Givre <cg...@gmail.com> on 2019/01/23 05:17:53 UTC

Regression? Drill Truncating Varchars

Hello all, 
I’m working on a format plugin to read syslog data, and have encountered what seems to be a regression.  The code below is a helper function which writes strings from the data.  As of Drill 1.16, the varchar holder throws an error if the string you are trying to write is longer than 256 characters.  Is there a workaround? 

Thanks!


//Helper function to map strings
private void mapStringField(String name, String value, BaseWriter.MapWriter map) {
  if (value == null) {
    return;
  }
  try {
    byte[] bytes = value.getBytes("UTF-8");
    int stringLength = bytes.length;
    this.buffer.setBytes(0, bytes, 0, stringLength);
    map.varChar(name).writeVarChar(0, stringLength, buffer);
  } catch (Exception e) {
    throw UserException
            .dataWriteError()
            .addContext("Could not write string: ")
            .addContext(e.getMessage())
            .build(logger);
  }
}
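The failure mode can be reproduced outside Drill with any fixed-capacity buffer. The sketch below uses java.nio.ByteBuffer purely as a stand-in for the managed DrillBuf; the 256- and 342-byte sizes are taken from the error message later in the thread, and Drill's own error reports an index range rather than throwing BufferOverflowException:

```java
import java.nio.BufferOverflowException;
import java.nio.ByteBuffer;

// A fixed-capacity buffer rejects writes past its end, which is the
// situation mapStringField() runs into when a field is longer than the
// buffer. ByteBuffer is only a stand-in here, not Drill's DrillBuf.
public class BufferOverflowDemo {
  public static void main(String[] args) {
    ByteBuffer buffer = ByteBuffer.allocate(256); // fixed 256-byte capacity
    byte[] bytes = new byte[342];                 // a field longer than the buffer
    try {
      buffer.put(bytes, 0, bytes.length);
      System.out.println("write succeeded");
    } catch (BufferOverflowException e) {
      System.out.println("overflow writing " + bytes.length + " bytes into 256");
    }
  }
}
```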





Re: Regression? Drill Truncating Varchars

Posted by Charles Givre <cg...@gmail.com>.
Hi Paul, 
Thanks! That’s what I was looking for.  I am still curious as to why this worked with Drill 1.15 but not with Drill 1.16, but we’ll save that for another day.  The unit tests all pass and the PR is ready to go.
Thanks everyone!

> On Jan 26, 2019, at 22:21, Paul Rogers <pa...@yahoo.com.INVALID> wrote:
> 
> Hi Charles,
> 
> A managed buffer is just a DrillBuf that the execution framework will free for you when the query fragment shuts down.
> 
> However, nothing can determine when you write past the end of the buffer and automatically resize it. You still must do the reallocation yourself.
> 
> You probably need something like:
> 
> int strLen = ...
> buffer = buffer.reallocIfNeeded(strLen);
> // Do something with buffer up to strLen bytes
> 
> The above pattern will return the original buffer if it is large enough, else will allocate a new one sized to the next power of 2 at or above strLen.
> 
> Thanks,
> - Paul
> 
> 
> 
>    On Saturday, January 26, 2019, 3:51:50 PM PST, Charles Givre <cg...@gmail.com> wrote:  
> 
> Hi Karthik, 
> Thanks for your response.  This was working in Drill 1.15 when I submitted the original PR.  I updated the code for Drill 1.16 and suddenly the unit tests started failing and I couldn’t figure out why.  After some digging, I found that the buffer situation was the issue.  I was under the impression that the getManagedBuffer() function would allocate and/or reallocate space as needed but I guess that is not the case.  Did something change with the BufferManagerImpl class for Drill 1.16? 
> 
> In any event, what would the best practice be?  Would you suggest just setting the buffer size to a large value and hope for the best?  
> Thanks!
> — C
> 
> 
> 
> 
>> On Jan 25, 2019, at 20:17, Karthikeyan Manivannan <km...@mapr.com> wrote:
>> 
>> Your buffer is allocated as
>> 
>> this.buffer = context.getManagedBuffer();
>> 
>> and that boils down to
>> 
>> BufferManagerImpl.java
>> 
>> @Override
>> public DrillBuf getManagedBuffer() {
>>   return getManagedBuffer(256);
>> }
>> 
>> This may be the root of the problem.
>> 
>> Thanks.
>> 
>> Karthik
>> 
>> On Fri, Jan 25, 2019 at 3:02 PM Karthikeyan Manivannan <km...@mapr.com>
>> wrote:
>> 
>>> Hi Charles,
>>> 
>>> In the code that you had pasted
>>> 
>>> this.buffer.setBytes(0, bytes, 0, stringLength);
>>> 
>>> what guarantees that this.buffer has enough space for stringLength bytes?
>>> 
>>> Karthik
>>> 
>>> On Fri, Jan 25, 2019 at 2:32 PM Karthikeyan Manivannan <
>>> kmanivannan@mapr.com> wrote:
>>> 
>>>> Charles,
>>>> 
>>>> Does this work on 1.15 ?
>>>> 
>>>> Drill 1.16 is able to correctly read CSV files with > 256 char strings,
>>>> so I guess the problem might be in the Syslog plugin code.
>>>> Can you share your format plugin code ?
>>>> 
>>>> Thanks.
>>>> 
>>>> Karthik
>>>> 
>>>> 
>>>> On Thu, Jan 24, 2019 at 4:24 PM Charles Givre <cg...@gmail.com> wrote:
>>>> 
>>>>> Here you go… Thanks for your help!
>>>>> 
>>>>> 
>>>>> SELECT
>>>>> event_date,severity_code,facility_code,severity,facility,ip,app_name,process_id,message_id,structured_data_text,structured_data_UserAgent,structured_data_UserHostAddress,structured_data_BrowserSession,structured_data_Realm,structured_data_Appliance,structured_data_Company,structured_data_UserID,structured_data_PEN,structured_data_HostName,structured_data_Category,structured_data_Priority,message
>>>>> FROM cp.`syslog/test.syslog1`
>>>>> 18:16:50.061 [main] ERROR org.apache.drill.TestReporter - Test Failed
>>>>> (d: 0 B(1 B), h: 21.6 MiB(130.9 MiB), nh: 2.0 MiB(82.2 MiB)):
>>>>> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>>> java.lang.NullPointerException: null
>>>>>         at
>>>>> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
>>>>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>>>>>         at
>>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>>>>> ~[test-classes/:na]
>>>>>         at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
>>>>> 18:16:50.571 [main] ERROR org.apache.drill.TestReporter - Test Failed
>>>>> (d: 0 B(1 B), h: 20.5 MiB(209.9 MiB), nh: 411.1 KiB(85.4 MiB)):
>>>>> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>>> java.lang.NullPointerException: null
>>>>>         at
>>>>> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
>>>>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>>>>>         at
>>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>>>>> ~[test-classes/:na]
>>>>>         at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
>>>>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0, Time elapsed:
>>>>> 3.173 s <<< FAILURE! - in
>>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat
>>>>> [ERROR]
>>>>> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>>> Time elapsed: 0.234 s  <<< ERROR!
>>>>> java.lang.NullPointerException
>>>>>         at
>>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>>>>> 
>>>>> [ERROR]
>>>>> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>>> Time elapsed: 0.125 s  <<< ERROR!
>>>>> java.lang.NullPointerException
>>>>>         at
>>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>>>>> 
>>>>> [INFO]
>>>>> [INFO] Results:
>>>>> [INFO]
>>>>> [ERROR] Errors:
>>>>> [ERROR]  TestSyslogFormat.testExplicitFlattenedStructuredDataQuery:310
>>>>> » NullPointer
>>>>> [ERROR]  TestSyslogFormat.testStarFlattenedStructuredDataQuery:248 »
>>>>> NullPointer
>>>>> [INFO]
>>>>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0
>>>>> [INFO]
>>>>> [INFO]
>>>>> ------------------------------------------------------------------------
>>>>> [INFO] BUILD FAILURE
>>>>> [INFO]
>>>>> ------------------------------------------------------------------------
>>>>> [INFO] Total time:  33.919 s
>>>>> [INFO] Finished at: 2019-01-24T18:16:51-05:00
>>>>> [INFO]
>>>>> ------------------------------------------------------------------------
>>>>> [ERROR] Failed to execute goal
>>>>> org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test (default-test)
>>>>> on project drill-format-syslog: There are test failures.
>>>>> [ERROR]
>>>>> [ERROR] Please refer to
>>>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>>>> for the individual test results.
>>>>> [ERROR] Please refer to dump files (if any exist) [date].dump,
>>>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>>> [ERROR] -> [Help 1]
>>>>> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to
>>>>> execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test
>>>>> (default-test) on project drill-format-syslog: There are test failures.
>>>>> 
>>>>> Please refer to
>>>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>>>> for the individual test results.
>>>>> Please refer to dump files (if any exist) [date].dump,
>>>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>>> (MojoExecutor.java:215)
>>>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>>> (MojoExecutor.java:156)
>>>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>>> (MojoExecutor.java:148)
>>>>>     at
>>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>>> (LifecycleModuleBuilder.java:117)
>>>>>     at
>>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>>> (LifecycleModuleBuilder.java:81)
>>>>>     at
>>>>> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>>>>> (SingleThreadedBuilder.java:56)
>>>>>     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
>>>>> (LifecycleStarter.java:128)
>>>>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>>>>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>>>>>     at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>>>>>     at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>>>>>     at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>>>>>     at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke
>>>>> (NativeMethodAccessorImpl.java:62)
>>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke
>>>>> (DelegatingMethodAccessorImpl.java:43)
>>>>>     at java.lang.reflect.Method.invoke (Method.java:497)
>>>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
>>>>> (Launcher.java:289)
>>>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launch
>>>>> (Launcher.java:229)
>>>>>     at
>>>>> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
>>>>> (Launcher.java:415)
>>>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.main
>>>>> (Launcher.java:356)
>>>>> Caused by: org.apache.maven.plugin.MojoFailureException: There are test
>>>>> failures.
>>>>> 
>>>>> Please refer to
>>>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>>>> for the individual test results.
>>>>> Please refer to dump files (if any exist) [date].dump,
>>>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>>>     at org.apache.maven.plugin.surefire.SurefireHelper.throwException
>>>>> (SurefireHelper.java:271)
>>>>>     at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution
>>>>> (SurefireHelper.java:159)
>>>>>     at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary
>>>>> (SurefirePlugin.java:373)
>>>>>     at
>>>>> org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked
>>>>> (AbstractSurefireMojo.java:1016)
>>>>>     at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute
>>>>> (AbstractSurefireMojo.java:846)
>>>>>     at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo
>>>>> (DefaultBuildPluginManager.java:137)
>>>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>>> (MojoExecutor.java:210)
>>>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>>> (MojoExecutor.java:156)
>>>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>>> (MojoExecutor.java:148)
>>>>>     at
>>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>>> (LifecycleModuleBuilder.java:117)
>>>>>     at
>>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>>> (LifecycleModuleBuilder.java:81)
>>>>>     at
>>>>> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>>>>> (SingleThreadedBuilder.java:56)
>>>>>     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
>>>>> (LifecycleStarter.java:128)
>>>>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>>>>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>>>>>     at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>>>>>     at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>>>>>     at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>>>>>     at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke
>>>>> (NativeMethodAccessorImpl.java:62)
>>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke
>>>>> (DelegatingMethodAccessorImpl.java:43)
>>>>>     at java.lang.reflect.Method.invoke (Method.java:497)
>>>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
>>>>> (Launcher.java:289)
>>>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launch
>>>>> (Launcher.java:229)
>>>>>     at
>>>>> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
>>>>> (Launcher.java:415)
>>>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.main
>>>>> (Launcher.java:356)
>>>>> [ERROR]
>>>>> [ERROR]
>>>>> [ERROR] For more information about the errors and possible solutions,
>>>>> please read the following articles:
>>>>> [ERROR] [Help 1]
>>>>> http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
>>>>> 
>>>>> 
>>>>> If you run the query in sqlline, here is the result:
>>>>> 
>>>>> 
>>>>> jdbc:drill:zk=local> select * from dfs.test.`test.syslog1`;
>>>>> Error: DATA_READ ERROR: Error parsing file
>>>>> 
>>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>>> 
>>>>> Line: 1
>>>>> DATA_WRITE ERROR: null
>>>>> 
>>>>> Could not write string:
>>>>> index: 0, length: 342 (expected: range(0, 256))
>>>>> 
>>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>>> 
>>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>>> Fragment 0:0
>>>>> 
>>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>>> (state=,code=0)
>>>>> java.sql.SQLException: DATA_READ ERROR: Error parsing file
>>>>> 
>>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>>> 
>>>>> Line: 1
>>>>> DATA_WRITE ERROR: null
>>>>> 
>>>>> Could not write string:
>>>>> index: 0, length: 342 (expected: range(0, 256))
>>>>> 
>>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>>> 
>>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>>> Fragment 0:0
>>>>> 
>>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>>>         at
>>>>> org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:536)
>>>>>         at
>>>>> org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:608)
>>>>>         at
>>>>> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1288)
>>>>>         at
>>>>> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:61)
>>>>>         at
>>>>> org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
>>>>>         at
>>>>> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1107)
>>>>>         at
>>>>> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1118)
>>>>>         at
>>>>> org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
>>>>>         at
>>>>> org.apache.drill.jdbc.impl.DrillConnectionImpl.prepareAndExecuteInternal(DrillConnectionImpl.java:200)
>>>>>         at
>>>>> org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
>>>>>         at
>>>>> org.apache.calcite.avatica.AvaticaStatement.execute(AvaticaStatement.java:217)
>>>>>         at sqlline.Commands.execute(Commands.java:938)
>>>>>         at sqlline.Commands.sql(Commands.java:882)
>>>>>         at sqlline.SqlLine.dispatch(SqlLine.java:725)
>>>>>         at sqlline.SqlLine.begin(SqlLine.java:540)
>>>>>         at sqlline.SqlLine.start(SqlLine.java:264)
>>>>>         at sqlline.SqlLine.main(SqlLine.java:195)
>>>>> Caused by: org.apache.drill.common.exceptions.UserRemoteException:
>>>>> DATA_READ ERROR: Error parsing file
>>>>> 
>>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>>> 
>>>>> Line: 1
>>>>> DATA_WRITE ERROR: null
>>>>> 
>>>>> Could not write string:
>>>>> index: 0, length: 342 (expected: range(0, 256))
>>>>> 
>>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>>> 
>>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>>> Fragment 0:0
>>>>> 
>>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>>>         at
>>>>> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123)
>>>>>         at
>>>>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:422)
>>>>>         at
>>>>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:96)
>>>>>         at
>>>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:273)
>>>>>         at
>>>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:243)
>>>>>         at
>>>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>>         at
>>>>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>>         at
>>>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>>         at
>>>>> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
>>>>>         at
>>>>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>>         at
>>>>> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>>         at
>>>>> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>>         at
>>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>>         at
>>>>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
>>>>>         at
>>>>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>>>>>         at
>>>>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
>>>>>         at
>>>>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
>>>>>         at
>>>>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
>>>>>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
>>>>>         at
>>>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
>>>>>         at java.lang.Thread.run(Thread.java:745)
>>>>> Caused by: java.lang.Exception: DATA_READ ERROR: Error parsing file
>>>>> 
>>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>>> 
>>>>> Line: 1
>>>>> DATA_WRITE ERROR: null
>>>>> 
>>>>> Could not write string:
>>>>> index: 0, length: 342 (expected: range(0, 256))
>>>>> 
>>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>>> 
>>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>>> Fragment 0:0
>>>>> 
>>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>>>         at
>>>>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:633)
>>>>>         at
>>>>> org.apache.drill.exec.store.syslog.SyslogRecordReader.next(SyslogRecordReader.java:169)
>>>>>         at
>>>>> org.apache.drill.exec.physical.impl.ScanBatch.internalNext(ScanBatch.java:223)
>>>>>         at
>>>>> org.apache.drill.exec.physical.impl.ScanBatch.next(ScanBatch.java:271)
>>>>>         at
>>>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:126)
>>>>>         at
>>>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:116)
>>>>>         at
>>>>> org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext(AbstractUnaryRecordBatch.java:63)
>>>>>         at
>>>>> org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:143)
>>>>>         at
>>>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:186)
>>>>>         at
>>>>> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:104)
>>>>>         at
>>>>> org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:83)
>>>>>         at
>>>>> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:94)
>>>>>         at org.apache.drill.exec.work
>>>>> .fragment.FragmentExecutor$1.run(FragmentExecutor.java:297)
>>>>>         at org.apache.drill.exec.work
>>>>> .fragment.FragmentExecutor$1.run(FragmentExecutor.java:284)
>>>>>         at .......(:0)
>>>>>         at
>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>>>>>         at org.apache.drill.exec.work
>>>>> .fragment.FragmentExecutor.run(FragmentExecutor.java:284)
>>>>>         at
>>>>> org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
>>>>>         at .......(:0)
>>>>> 
>>>>> 
>>>>> If you run a query without long fields, it works fine:
>>>>> jdbc:drill:zk=local> select * from dfs.test.`logs.syslog1`;
>>>>> 
>>>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>>>> |        event_date        | severity_code  | facility_code  | severity
>>>>> | facility  |          ip          | app_name  | message_id  |
>>>>>         message                    | process_id  |
>>>>>       structured_data_text                              |
>>>>>             structured_data                              |
>>>>> 
>>>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>>>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
>>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>>                                                             | {}
>>>>>                                                             |
>>>>> | 1985-04-12 23:20:50.52  | 2              | 4              | CRIT
>>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>>                                                             | {}
>>>>>                                                             |
>>>>> | 1985-04-12 23:20:50.52  | 2              | 4              | CRIT
>>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>>                                                             | {}
>>>>>                                                             |
>>>>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
>>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>>                                                             | {}
>>>>>                                                             |
>>>>> | 2003-08-24 12:14:15.0    | 2              | 4              | CRIT
>>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>>                                                             | {}
>>>>>                                                             |
>>>>> | 2003-08-24 12:14:15.0    | 5              | 20            | NOTICE
>>>>> | LOCAL4    | 192.0.2.1              | myproc    | null        | %% It's
>>>>> time to make the do-nuts.              | 8710        | null
>>>>>                                                             | {}
>>>>>                                                             |
>>>>> | 2003-10-11 22:14:15.003  | 5              | 20            | NOTICE
>>>>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | null
>>>>>                                         | null        |
>>>>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
>>>>> eventSource=Application, eventID=1011]} |
>>>>> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>>>>> | 2003-10-11 22:14:15.003  | 5              | 20            | NOTICE
>>>>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | - and
>>>>> thats a wrap!                            | null        |
>>>>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
>>>>> eventSource=Application, eventID=1011]} |
>>>>> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>>>>> 
>>>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>>>> 8 rows selected (1.702 seconds)
>>>>> 
>>>>> 
>>>>>> On Jan 24, 2019, at 17:02, Karthikeyan Manivannan <
>>>>> kmanivannan@mapr.com> wrote:
>>>>>> 
>>>>>> Hi Charles,
>>>>>> 
>>>>>> Can you please provide the stack trace.
>>>>>> 
>>>>>> Thanks
>>>>>> 
>>>>>> Karthik
>>>>>> 
>>>>>> On Tue, Jan 22, 2019 at 9:18 PM Charles Givre <cg...@gmail.com>
>>>>> wrote:
>>>>>> 
>>>>>>> Hello all,
>>>>>>> I’m working on a format plugin to read syslog data, and have
>>>>> encountered
>>>>>>> what seems to be a bit of a regression (maybe).  The code below is a
>>>>>>> helper function which writes strings from the data.  As of Drill
>>>>> 1.16, the
>>>>>>> varchar holder seems to throw an error if the string you are trying to
>>>>>>> write is > 256 characters.  Is there a workaround?
>>>>>>> 
>>>>>>> Thanks!
>>>>>>> 
>>>>>>> 
>>>>>>> //Helper function to map strings
>>>>>>> private void mapStringField(String name, String value,
>>>>>>> BaseWriter.MapWriter map) {
>>>>>>> if (value == null) {
>>>>>>>   return;
>>>>>>> }
>>>>>>> try {
>>>>>>>   byte[] bytes = value.getBytes("UTF-8");
>>>>>>>   int stringLength = bytes.length;
>>>>>>>   this.buffer.setBytes(0, bytes, 0, stringLength);
>>>>>>>   map.varChar(name).writeVarChar(0, stringLength, buffer);
>>>>>>> } catch (Exception e) {
>>>>>>>   throw UserException
>>>>>>>           .dataWriteError()
>>>>>>>           .addContext("Could not write string: ")
>>>>>>>           .addContext(e.getMessage())
>>>>>>>           .build(logger);
>>>>>>> }
>>>>>>> }
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>> 
>>>>> 


Re: Regression? Drill Truncating Varchars

Posted by Paul Rogers <pa...@yahoo.com.INVALID>.
Hi Charles,

A managed buffer is just a DrillBuf that the execution framework will free for you when the query fragment shuts down.

However, nothing can determine when you write past the end of the buffer and automatically resize it. You still must do the reallocation yourself.

You probably need something like:

int strLen = ...
buffer = buffer.reallocIfNeeded(strLen);
// Do something with buffer up to strLen bytes

The above pattern will return the original buffer if it is large enough, else will allocate a new one sized to the next power of 2 at or above strLen.

Thanks,
- Paul
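The contract Paul describes can be sketched with a plain byte array standing in for DrillBuf. Everything below (the class name, the array simulation) is illustrative rather than a Drill API, but the sizing rule matches his description: reuse the buffer when it is already big enough, otherwise grow to the next power of two at or above the requested length.

```java
// Sketch of the reallocIfNeeded contract, simulated with a plain byte[]
// instead of a DrillBuf (ReallocSketch and this method are illustrative
// names, not Drill APIs).
public class ReallocSketch {

  static byte[] reallocIfNeeded(byte[] buf, int needed) {
    if (buf.length >= needed) {
      return buf;                      // already large enough: reuse it
    }
    int size = Integer.highestOneBit(needed);
    if (size < needed) {
      size <<= 1;                      // round up to the next power of two
    }
    return new byte[size];             // caller must adopt the new buffer
  }

  public static void main(String[] args) {
    byte[] buffer = new byte[256];     // same size as the 256-byte default
    buffer = reallocIfNeeded(buffer, 342);
    System.out.println(buffer.length); // 512: next power of two at or above 342
  }
}
```

In the plugin itself, the equivalent fix is Paul's snippet above: call buffer = buffer.reallocIfNeeded(stringLength) before buffer.setBytes(...).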

 

    On Saturday, January 26, 2019, 3:51:50 PM PST, Charles Givre <cg...@gmail.com> wrote:  
 
 Hi Karthik, 
Thanks for your response.  This was working in Drill 1.15 when I submitted the original PR.  I updated the code for Drill 1.16 and suddenly the unit tests started failing and I couldn’t figure out why.  After some digging, I found that the buffer situation was the issue.  I was under the impression that the getManagedBuffer() function would allocate and/or reallocate space as needed but I guess that is not the case.  Did something change with the BufferManagerImpl class for Drill 1.16? 

In any event, what would the best practice be?  Would you suggest just setting the buffer size to a large value and hope for the best?  
Thanks!
— C




> On Jan 25, 2019, at 20:17, Karthikeyan Manivannan <km...@mapr.com> wrote:
> 
> Your buffer is allocated as
> 
> this.buffer = context.getManagedBuffer();
> 
> and that boils down to
> 
> BufferManagerImpl.java
> 
> @Override
> public DrillBuf getManagedBuffer() {
>  return getManagedBuffer(256);
> }
> 
> This may be the root of the problem.
> 
> Thanks.
> 
> Karthik
> 
> On Fri, Jan 25, 2019 at 3:02 PM Karthikeyan Manivannan <km...@mapr.com>
> wrote:
> 
>> Hi Charles,
>> 
>> In the code that you had pasted
>> 
>> this.buffer.setBytes(0, bytes, 0, stringLength);
>> 
>> what guarantees that this.buffer has enough space for stringLength chars?
>> 
>> Karthik
>> 
>> On Fri, Jan 25, 2019 at 2:32 PM Karthikeyan Manivannan <
>> kmanivannan@mapr.com> wrote:
>> 
>>> Charles,
>>> 
>>> Does this work on 1.15 ?
>>> 
>>> Drill 1.16 is able to correctly read CSV files with > 256 char strings,
>>> so I guess the problem might be in the Syslog plugin code.
>>> Can you share your format plugin code ?
>>> 
>>> Thanks.
>>> 
>>> Karthik
>>> 
>>> 
>>> On Thu, Jan 24, 2019 at 4:24 PM Charles Givre <cg...@gmail.com> wrote:
>>> 
>>>> Here you go… Thanks for your help!
>>>> 
>>>> 
>>>> SELECT
>>>> event_date,severity_code,facility_code,severity,facility,ip,app_name,process_id,message_id,structured_data_text,structured_data_UserAgent,structured_data_UserHostAddress,structured_data_BrowserSession,structured_data_Realm,structured_data_Appliance,structured_data_Company,structured_data_UserID,structured_data_PEN,structured_data_HostName,structured_data_Category,structured_data_Priority,message
>>>> FROM cp.`syslog/test.syslog1`
>>>> 18:16:50.061 [main] ERROR org.apache.drill.TestReporter - Test Failed
>>>> (d: 0 B(1 B), h: 21.6 MiB(130.9 MiB), nh: 2.0 MiB(82.2 MiB)):
>>>> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>> java.lang.NullPointerException: null
>>>>        at
>>>> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
>>>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>>>>        at
>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>>>> ~[test-classes/:na]
>>>>        at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
>>>> 18:16:50.571 [main] ERROR org.apache.drill.TestReporter - Test Failed
>>>> (d: 0 B(1 B), h: 20.5 MiB(209.9 MiB), nh: 411.1 KiB(85.4 MiB)):
>>>> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>> java.lang.NullPointerException: null
>>>>        at
>>>> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
>>>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>>>>        at
>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>>>> ~[test-classes/:na]
>>>>        at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
>>>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0, Time elapsed:
>>>> 3.173 s <<< FAILURE! - in
>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat
>>>> [ERROR]
>>>> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>> Time elapsed: 0.234 s  <<< ERROR!
>>>> java.lang.NullPointerException
>>>>        at
>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>>>> 
>>>> [ERROR]
>>>> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>> Time elapsed: 0.125 s  <<< ERROR!
>>>> java.lang.NullPointerException
>>>>        at
>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>>>> 
>>>> [INFO]
>>>> [INFO] Results:
>>>> [INFO]
>>>> [ERROR] Errors:
>>>> [ERROR]  TestSyslogFormat.testExplicitFlattenedStructuredDataQuery:310
>>>> » NullPointer
>>>> [ERROR]  TestSyslogFormat.testStarFlattenedStructuredDataQuery:248 »
>>>> NullPointer
>>>> [INFO]
>>>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0
>>>> [INFO]
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [INFO] BUILD FAILURE
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [INFO] Total time:  33.919 s
>>>> [INFO] Finished at: 2019-01-24T18:16:51-05:00
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [ERROR] Failed to execute goal
>>>> org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test (default-test)
>>>> on project drill-format-syslog: There are test failures.
>>>> [ERROR]
>>>> [ERROR] Please refer to
>>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>>> for the individual test results.
>>>> [ERROR] Please refer to dump files (if any exist) [date].dump,
>>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>> [ERROR] -> [Help 1]
>>>> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to
>>>> execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test
>>>> (default-test) on project drill-format-syslog: There are test failures.
>>>> 
>>>> Please refer to
>>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>>> for the individual test results.
>>>> Please refer to dump files (if any exist) [date].dump,
>>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:215)
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:156)
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:148)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>> (LifecycleModuleBuilder.java:117)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>> (LifecycleModuleBuilder.java:81)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>>>> (SingleThreadedBuilder.java:56)
>>>>    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
>>>> (LifecycleStarter.java:128)
>>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>>>>    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>>>>    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>>>>    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>>>>    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>>>>    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>>>>    at sun.reflect.NativeMethodAccessorImpl.invoke
>>>> (NativeMethodAccessorImpl.java:62)
>>>>    at sun.reflect.DelegatingMethodAccessorImpl.invoke
>>>> (DelegatingMethodAccessorImpl.java:43)
>>>>    at java.lang.reflect.Method.invoke (Method.java:497)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
>>>> (Launcher.java:289)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launch
>>>> (Launcher.java:229)
>>>>    at
>>>> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
>>>> (Launcher.java:415)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.main
>>>> (Launcher.java:356)
>>>> Caused by: org.apache.maven.plugin.MojoFailureException: There are test
>>>> failures.
>>>> 
>>>> Please refer to
>>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>>> for the individual test results.
>>>> Please refer to dump files (if any exist) [date].dump,
>>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>>    at org.apache.maven.plugin.surefire.SurefireHelper.throwException
>>>> (SurefireHelper.java:271)
>>>>    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution
>>>> (SurefireHelper.java:159)
>>>>    at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary
>>>> (SurefirePlugin.java:373)
>>>>    at
>>>> org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked
>>>> (AbstractSurefireMojo.java:1016)
>>>>    at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute
>>>> (AbstractSurefireMojo.java:846)
>>>>    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo
>>>> (DefaultBuildPluginManager.java:137)
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:210)
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:156)
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:148)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>> (LifecycleModuleBuilder.java:117)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>> (LifecycleModuleBuilder.java:81)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>>>> (SingleThreadedBuilder.java:56)
>>>>    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
>>>> (LifecycleStarter.java:128)
>>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>>>>    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>>>>    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>>>>    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>>>>    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>>>>    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>>>>    at sun.reflect.NativeMethodAccessorImpl.invoke
>>>> (NativeMethodAccessorImpl.java:62)
>>>>    at sun.reflect.DelegatingMethodAccessorImpl.invoke
>>>> (DelegatingMethodAccessorImpl.java:43)
>>>>    at java.lang.reflect.Method.invoke (Method.java:497)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
>>>> (Launcher.java:289)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launch
>>>> (Launcher.java:229)
>>>>    at
>>>> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
>>>> (Launcher.java:415)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.main
>>>> (Launcher.java:356)
>>>> [ERROR]
>>>> [ERROR]
>>>> [ERROR] For more information about the errors and possible solutions,
>>>> please read the following articles:
>>>> [ERROR] [Help 1]
>>>> http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
>>>> 
>>>> 
>>>> If you run the query in sqline, here is the result:
>>>> 
>>>> 
>>>> jdbc:drill:zk=local> select * from dfs.test.`test.syslog1`;
>>>> Error: DATA_READ ERROR: Error parsing file
>>>> 
>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>> 
>>>> Line: 1
>>>> DATA_WRITE ERROR: null
>>>> 
>>>> Could not write string:
>>>> index: 0, length: 342 (expected: range(0, 256))
>>>> 
>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>> 
>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>> Fragment 0:0
>>>> 
>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>> (state=,code=0)
>>>> java.sql.SQLException: DATA_READ ERROR: Error parsing file
>>>> 
>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>> 
>>>> Line: 1
>>>> DATA_WRITE ERROR: null
>>>> 
>>>> Could not write string:
>>>> index: 0, length: 342 (expected: range(0, 256))
>>>> 
>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>> 
>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>> Fragment 0:0
>>>> 
>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:536)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:608)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1288)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:61)
>>>>        at
>>>> org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1107)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1118)
>>>>        at
>>>> org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillConnectionImpl.prepareAndExecuteInternal(DrillConnectionImpl.java:200)
>>>>        at
>>>> org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
>>>>        at
>>>> org.apache.calcite.avatica.AvaticaStatement.execute(AvaticaStatement.java:217)
>>>>        at sqlline.Commands.execute(Commands.java:938)
>>>>        at sqlline.Commands.sql(Commands.java:882)
>>>>        at sqlline.SqlLine.dispatch(SqlLine.java:725)
>>>>        at sqlline.SqlLine.begin(SqlLine.java:540)
>>>>        at sqlline.SqlLine.start(SqlLine.java:264)
>>>>        at sqlline.SqlLine.main(SqlLine.java:195)
>>>> Caused by: org.apache.drill.common.exceptions.UserRemoteException:
>>>> DATA_READ ERROR: Error parsing file
>>>> 
>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>> 
>>>> Line: 1
>>>> DATA_WRITE ERROR: null
>>>> 
>>>> Could not write string:
>>>> index: 0, length: 342 (expected: range(0, 256))
>>>> 
>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>> 
>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>> Fragment 0:0
>>>> 
>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>>        at
>>>> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123)
>>>>        at
>>>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:422)
>>>>        at
>>>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:96)
>>>>        at
>>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:273)
>>>>        at
>>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:243)
>>>>        at
>>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>        at
>>>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>        at
>>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>        at
>>>> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
>>>>        at
>>>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>        at
>>>> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>        at
>>>> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
>>>>        at
>>>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>>>>        at
>>>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
>>>>        at
>>>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
>>>>        at
>>>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
>>>>        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
>>>>        at
>>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
>>>>        at java.lang.Thread.run(Thread.java:745)
>>>> Caused by: java.lang.Exception: DATA_READ ERROR: Error parsing file
>>>> 
>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>> 
>>>> Line: 1
>>>> DATA_WRITE ERROR: null
>>>> 
>>>> Could not write string:
>>>> index: 0, length: 342 (expected: range(0, 256))
>>>> 
>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>> 
>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>> Fragment 0:0
>>>> 
>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>>        at
>>>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:633)
>>>>        at
>>>> org.apache.drill.exec.store.syslog.SyslogRecordReader.next(SyslogRecordReader.java:169)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.ScanBatch.internalNext(ScanBatch.java:223)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.ScanBatch.next(ScanBatch.java:271)
>>>>        at
>>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:126)
>>>>        at
>>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:116)
>>>>        at
>>>> org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext(AbstractUnaryRecordBatch.java:63)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:143)
>>>>        at
>>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:186)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:104)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:83)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:94)
>>>>        at org.apache.drill.exec.work
>>>> .fragment.FragmentExecutor$1.run(FragmentExecutor.java:297)
>>>>        at org.apache.drill.exec.work
>>>> .fragment.FragmentExecutor$1.run(FragmentExecutor.java:284)
>>>>        at .......(:0)
>>>>        at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>>>>        at org.apache.drill.exec.work
>>>> .fragment.FragmentExecutor.run(FragmentExecutor.java:284)
>>>>        at
>>>> org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
>>>>        at .......(:0)
>>>> 
>>>> 
>>>> If you run a query w/o long fields it works fine:
jdbc:drill:zk=local> select * from dfs.test.`logs.syslog1`;
>>>> 
>>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>>> |        event_date        | severity_code  | facility_code  | severity
>>>> | facility  |          ip          | app_name  | message_id  |
>>>>        message                    | process_id  |
>>>>      structured_data_text                              |
>>>>            structured_data                              |
>>>> 
>>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>                                                            | {}
>>>>                                                            |
>>>> | 1985-04-12 23:20:50.52  | 2              | 4              | CRIT
>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>                                                            | {}
>>>>                                                            |
>>>> | 1985-04-12 23:20:50.52  | 2              | 4              | CRIT
>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>                                                            | {}
>>>>                                                            |
>>>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>                                                            | {}
>>>>                                                            |
>>>> | 2003-08-24 12:14:15.0    | 2              | 4              | CRIT
>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>                                                            | {}
>>>>                                                            |
>>>> | 2003-08-24 12:14:15.0    | 5              | 20            | NOTICE
>>>> | LOCAL4    | 192.0.2.1              | myproc    | null        | %% It's
>>>> time to make the do-nuts.              | 8710        | null
>>>>                                                            | {}
>>>>                                                            |
>>>> | 2003-10-11 22:14:15.003  | 5              | 20            | NOTICE
>>>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | null
>>>>                                        | null        |
>>>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
>>>> eventSource=Application, eventID=1011]} |
>>>> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>>>> | 2003-10-11 22:14:15.003  | 5              | 20            | NOTICE
>>>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | - and
>>>> thats a wrap!                            | null        |
>>>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
>>>> eventSource=Application, eventID=1011]} |
>>>> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>>>> 
>>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>>> 8 rows selected (1.702 seconds)
>>>> 
>>>> 
>>>>> On Jan 24, 2019, at 17:02, Karthikeyan Manivannan <
>>>> kmanivannan@mapr.com> wrote:
>>>>> 
>>>>> Hi Charles,
>>>>> 
>>>>> Can you please provide the stack trace.
>>>>> 
>>>>> Thanks
>>>>> 
>>>>> Karthik
>>>>> 
>>>>> On Tue, Jan 22, 2019 at 9:18 PM Charles Givre <cg...@gmail.com>
>>>> wrote:
>>>>> 
>>>>>> Hello all,
>>>>>> I’m working on a format plugin to read syslog data, and have
>>>> encountered
>>>>>> what seems to be a bit of a regression (maybe).  The code below is a
>>>>>> helper function which writes strings from the data.  As of Drill
>>>> 1.16, the
>>>>>> varchar holder seems to throw an error if the string you are trying to
>>>>>> write is > 256 characters.  Is there a workaround?
>>>>>> 
>>>>>> Thanks!
>>>>>> 
>>>>>> 
>>>>>> //Helper function to map strings
>>>>>> private void mapStringField(String name, String value,
>>>>>> BaseWriter.MapWriter map) {
>>>>>> if (value == null) {
>>>>>>  return;
>>>>>> }
>>>>>> try {
>>>>>>  byte[] bytes = value.getBytes("UTF-8");
>>>>>>  int stringLength = bytes.length;
>>>>>>  this.buffer.setBytes(0, bytes, 0, stringLength);
>>>>>>  map.varChar(name).writeVarChar(0, stringLength, buffer);
>>>>>> } catch (Exception e) {
>>>>>>  throw UserException
>>>>>>          .dataWriteError()
>>>>>>          .addContext("Could not write string: ")
>>>>>>          .addContext(e.getMessage())
>>>>>>          .build(logger);
>>>>>> }
>>>>>> }
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>> 
>>>> 
  

Re: Regression? Drill Truncating Varchars

Posted by Karthikeyan Manivannan <km...@mapr.com>.
Hi Charles,

setBytes() does not check for overflow and do a realloc().

Not sure why this fails only in 1.16.
git tells me that the code for BufferManagerImpl getManagedBuffer() has
not changed since 2015.

I was looking at other plugins to see how this is handled and I saw that in
CompliantTextRecordReader the buffer sizes are hard coded

private static final int READ_BUFFER = 1024*1024;
...
...
readBuffer = context.getAllocator().buffer(READ_BUFFER);


I am not sure what the best fix in your case is, but maybe you can use:

final int SYSLOG_BUFFER = 1024 * 1024;
this.buffer = context.getManagedBuffer(SYSLOG_BUFFER);

and throw an exception if the stringLength > SYSLOG_BUFFER.
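A standalone sketch of that guard, with illustrative names throughout: writeString is a hypothetical helper, IllegalArgumentException stands in for Drill's UserException, and a plain byte[] stands in for the managed DrillBuf.

```java
import java.nio.charset.StandardCharsets;

public class FixedBufferSketch {
  // One large up-front allocation instead of per-write reallocation.
  static final int SYSLOG_BUFFER = 1024 * 1024;
  static final byte[] buffer = new byte[SYSLOG_BUFFER];

  // Copy the encoded string into the buffer, or fail loudly when it
  // cannot fit; returns the number of bytes written.
  static int writeString(String value) {
    byte[] bytes = value.getBytes(StandardCharsets.UTF_8);
    if (bytes.length > SYSLOG_BUFFER) {
      throw new IllegalArgumentException(
          "String of " + bytes.length + " bytes exceeds buffer of " + SYSLOG_BUFFER);
    }
    System.arraycopy(bytes, 0, buffer, 0, bytes.length);
    return bytes.length;
  }

  public static void main(String[] args) {
    System.out.println(writeString("x".repeat(342))); // prints 342
  }
}
```

The trade-off versus reallocIfNeeded is wasted memory for short fields in exchange for a fixed allocation and a hard, explicit limit.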

Thanks.

Karthik


On Sat, Jan 26, 2019 at 3:51 PM Charles Givre <cg...@gmail.com> wrote:

> HI Karthik,
> Thanks for your response.  This was working in Drill 1.15 when I submitted
> the original PR.  I updated the code for Drill 1.16 and suddenly the unit
> tests started failing and I couldn’t figure out why.  After some digging, I
> found that the buffer situation was the issue.  I was under the impression
> that the getManagedBuffer() function would allocate and/or reallocate space
> as needed but I guess that is not the case.  Did something change with the
> BufferManagerImpl class for Drill 1.16?
>
> In any event, what would the best practice be?  Would you suggest just
> setting the buffer size to a large value and hope for the best?
> Thanks!
> — C
>
>
>
>
> > On Jan 25, 2019, at 20:17, Karthikeyan Manivannan <km...@mapr.com>
> wrote:
> >
> > Your buffer is allocated as
> >
> > this.buffer = context.getManagedBuffer();
> >
> > and that boils down to
> >
> > BufferManagerImpl.java
> >
> > @Override
> > public DrillBuf getManagedBuffer() {
> >  return getManagedBuffer(256);
> > }
> >
> > This may be the root of the problem.
> >
> > Thanks.
> >
> > Karthik
> >
> > On Fri, Jan 25, 2019 at 3:02 PM Karthikeyan Manivannan <
> kmanivannan@mapr.com>
> > wrote:
> >
> >> Hi Charles,
> >>
> >> In the code that you had pasted
> >>
> >> this.buffer.setBytes(0, bytes, 0, stringLength);
> >>
> >> what guarantees that this.buffer has enough space for stringLength
> chars?
> >>
> >> Karthik
> >>
> >> On Fri, Jan 25, 2019 at 2:32 PM Karthikeyan Manivannan <
> >> kmanivannan@mapr.com> wrote:
> >>
> >>> Charles,
> >>>
> >>> Does this work on 1.15 ?
> >>>
> >>> Drill 1.16 is able to correctly read CSV files with > 256 char strings,
> >>> so I guess the problem might be in the Syslog plugin code.
> >>> Can you share your format plugin code ?
> >>>
> >>> Thanks.
> >>>
> >>> Karthik
> >>>
> >>>
> >>> On Thu, Jan 24, 2019 at 4:24 PM Charles Givre <cg...@gmail.com>
> wrote:
> >>>
> >>>> Here you go… Thanks for your help!
> >>>>
> >>>>
> >>>> SELECT
> >>>>
> event_date,severity_code,facility_code,severity,facility,ip,app_name,process_id,message_id,structured_data_text,structured_data_UserAgent,structured_data_UserHostAddress,structured_data_BrowserSession,structured_data_Realm,structured_data_Appliance,structured_data_Company,structured_data_UserID,structured_data_PEN,structured_data_HostName,structured_data_Category,structured_data_Priority,message
> >>>> FROM cp.`syslog/test.syslog1`
> >>>> 18:16:50.061 [main] ERROR org.apache.drill.TestReporter - Test Failed
> >>>> (d: 0 B(1 B), h: 21.6 MiB(130.9 MiB), nh: 2.0 MiB(82.2 MiB)):
> >>>>
> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
> >>>> java.lang.NullPointerException: null
> >>>>        at
> >>>>
> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
> >>>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
> >>>>        at
> >>>>
> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
> >>>> ~[test-classes/:na]
> >>>>        at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
> >>>> 18:16:50.571 [main] ERROR org.apache.drill.TestReporter - Test Failed
> >>>> (d: 0 B(1 B), h: 20.5 MiB(209.9 MiB), nh: 411.1 KiB(85.4 MiB)):
> >>>>
> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
> >>>> java.lang.NullPointerException: null
> >>>>        at
> >>>>
> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
> >>>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
> >>>>        at
> >>>>
> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
> >>>> ~[test-classes/:na]
> >>>>        at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
> >>>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0, Time
> elapsed:
> >>>> 3.173 s <<< FAILURE! - in
> >>>> org.apache.drill.exec.store.syslog.TestSyslogFormat
> >>>> [ERROR]
> >>>>
> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
> >>>> Time elapsed: 0.234 s  <<< ERROR!
> >>>> java.lang.NullPointerException
> >>>>        at
> >>>>
> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
> >>>>
> >>>> [ERROR]
> >>>>
> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
> >>>> Time elapsed: 0.125 s  <<< ERROR!
> >>>> java.lang.NullPointerException
> >>>>        at
> >>>>
> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
> >>>>
> >>>> [INFO]
> >>>> [INFO] Results:
> >>>> [INFO]
> >>>> [ERROR] Errors:
> >>>> [ERROR]
>  TestSyslogFormat.testExplicitFlattenedStructuredDataQuery:310
> >>>> » NullPointer
> >>>> [ERROR]   TestSyslogFormat.testStarFlattenedStructuredDataQuery:248 »
> >>>> NullPointer
> >>>> [INFO]
> >>>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0
> >>>> [INFO]
> >>>> [INFO]
> >>>>
> ------------------------------------------------------------------------
> >>>> [INFO] BUILD FAILURE
> >>>> [INFO]
> >>>>
> ------------------------------------------------------------------------
> >>>> [INFO] Total time:  33.919 s
> >>>> [INFO] Finished at: 2019-01-24T18:16:51-05:00
> >>>> [INFO]
> >>>>
> ------------------------------------------------------------------------
> >>>> [ERROR] Failed to execute goal
> >>>> org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test
> (default-test)
> >>>> on project drill-format-syslog: There are test failures.
> >>>> [ERROR]
> >>>> [ERROR] Please refer to
> >>>>
> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
> >>>> for the individual test results.
> >>>> [ERROR] Please refer to dump files (if any exist) [date].dump,
> >>>> [date]-jvmRun[N].dump and [date].dumpstream.
> >>>> [ERROR] -> [Help 1]
> >>>> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to
> >>>> execute goal
> org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test
> >>>> (default-test) on project drill-format-syslog: There are test
> failures.
> >>>>
> >>>> Please refer to
> >>>>
> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
> >>>> for the individual test results.
> >>>> Please refer to dump files (if any exist) [date].dump,
> >>>> [date]-jvmRun[N].dump and [date].dumpstream.
> >>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> >>>> (MojoExecutor.java:215)
> >>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> >>>> (MojoExecutor.java:156)
> >>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> >>>> (MojoExecutor.java:148)
> >>>>    at
> >>>>
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
> >>>> (LifecycleModuleBuilder.java:117)
> >>>>    at
> >>>>
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
> >>>> (LifecycleModuleBuilder.java:81)
> >>>>    at
> >>>>
> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
> >>>> (SingleThreadedBuilder.java:56)
> >>>>    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
> >>>> (LifecycleStarter.java:128)
> >>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
> >>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
> >>>>    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
> >>>>    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
> >>>>    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
> >>>>    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
> >>>>    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
> >>>>    at sun.reflect.NativeMethodAccessorImpl.invoke
> >>>> (NativeMethodAccessorImpl.java:62)
> >>>>    at sun.reflect.DelegatingMethodAccessorImpl.invoke
> >>>> (DelegatingMethodAccessorImpl.java:43)
> >>>>    at java.lang.reflect.Method.invoke (Method.java:497)
> >>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
> >>>> (Launcher.java:289)
> >>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launch
> >>>> (Launcher.java:229)
> >>>>    at
> >>>> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
> >>>> (Launcher.java:415)
> >>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.main
> >>>> (Launcher.java:356)
> >>>> Caused by: org.apache.maven.plugin.MojoFailureException: There are
> test
> >>>> failures.
> >>>>
> >>>> Please refer to
> >>>>
> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
> >>>> for the individual test results.
> >>>> Please refer to dump files (if any exist) [date].dump,
> >>>> [date]-jvmRun[N].dump and [date].dumpstream.
> >>>>    at org.apache.maven.plugin.surefire.SurefireHelper.throwException
> >>>> (SurefireHelper.java:271)
> >>>>    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution
> >>>> (SurefireHelper.java:159)
> >>>>    at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary
> >>>> (SurefirePlugin.java:373)
> >>>>    at
> >>>>
> org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked
> >>>> (AbstractSurefireMojo.java:1016)
> >>>>    at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute
> >>>> (AbstractSurefireMojo.java:846)
> >>>>    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo
> >>>> (DefaultBuildPluginManager.java:137)
> >>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> >>>> (MojoExecutor.java:210)
> >>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> >>>> (MojoExecutor.java:156)
> >>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> >>>> (MojoExecutor.java:148)
> >>>>    at
> >>>>
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
> >>>> (LifecycleModuleBuilder.java:117)
> >>>>    at
> >>>>
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
> >>>> (LifecycleModuleBuilder.java:81)
> >>>>    at
> >>>>
> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
> >>>> (SingleThreadedBuilder.java:56)
> >>>>    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
> >>>> (LifecycleStarter.java:128)
> >>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
> >>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
> >>>>    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
> >>>>    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
> >>>>    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
> >>>>    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
> >>>>    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
> >>>>    at sun.reflect.NativeMethodAccessorImpl.invoke
> >>>> (NativeMethodAccessorImpl.java:62)
> >>>>    at sun.reflect.DelegatingMethodAccessorImpl.invoke
> >>>> (DelegatingMethodAccessorImpl.java:43)
> >>>>    at java.lang.reflect.Method.invoke (Method.java:497)
> >>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
> >>>> (Launcher.java:289)
> >>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launch
> >>>> (Launcher.java:229)
> >>>>    at
> >>>> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
> >>>> (Launcher.java:415)
> >>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.main
> >>>> (Launcher.java:356)
> >>>> [ERROR]
> >>>> [ERROR]
> >>>> [ERROR] For more information about the errors and possible solutions,
> >>>> please read the following articles:
> >>>> [ERROR] [Help 1]
> >>>>
> http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> >>>>
> >>>>
> >>>> If you run the query in sqline, here is the result:
> >>>>
> >>>>
> >>>> jdbc:drill:zk=local> select * from dfs.test.`test.syslog1`;
> >>>> Error: DATA_READ ERROR: Error parsing file
> >>>>
> >>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
> >>>>
> >>>> Line: 1
> >>>> DATA_WRITE ERROR: null
> >>>>
> >>>> Could not write string:
> >>>> index: 0, length: 342 (expected: range(0, 256))
> >>>>
> >>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
> >>>>
> >>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
> >>>> Fragment 0:0
> >>>>
> >>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010
> ]
> >>>> (state=,code=0)
> >>>> java.sql.SQLException: DATA_READ ERROR: Error parsing file
> >>>>
> >>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
> >>>>
> >>>> Line: 1
> >>>> DATA_WRITE ERROR: null
> >>>>
> >>>> Could not write string:
> >>>> index: 0, length: 342 (expected: range(0, 256))
> >>>>
> >>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
> >>>>
> >>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
> >>>> Fragment 0:0
> >>>>
> >>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010
> ]
> >>>>        at
> >>>>
> org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:536)
> >>>>        at
> >>>>
> org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:608)
> >>>>        at
> >>>>
> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1288)
> >>>>        at
> >>>>
> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:61)
> >>>>        at
> >>>>
> org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
> >>>>        at
> >>>>
> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1107)
> >>>>        at
> >>>>
> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1118)
> >>>>        at
> >>>>
> org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
> >>>>        at
> >>>>
> org.apache.drill.jdbc.impl.DrillConnectionImpl.prepareAndExecuteInternal(DrillConnectionImpl.java:200)
> >>>>        at
> >>>>
> org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
> >>>>        at
> >>>>
> org.apache.calcite.avatica.AvaticaStatement.execute(AvaticaStatement.java:217)
> >>>>        at sqlline.Commands.execute(Commands.java:938)
> >>>>        at sqlline.Commands.sql(Commands.java:882)
> >>>>        at sqlline.SqlLine.dispatch(SqlLine.java:725)
> >>>>        at sqlline.SqlLine.begin(SqlLine.java:540)
> >>>>        at sqlline.SqlLine.start(SqlLine.java:264)
> >>>>        at sqlline.SqlLine.main(SqlLine.java:195)
> >>>> Caused by: org.apache.drill.common.exceptions.UserRemoteException:
> >>>> DATA_READ ERROR: Error parsing file
> >>>>
> >>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
> >>>>
> >>>> Line: 1
> >>>> DATA_WRITE ERROR: null
> >>>>
> >>>> Could not write string:
> >>>> index: 0, length: 342 (expected: range(0, 256))
> >>>>
> >>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
> >>>>
> >>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
> >>>> Fragment 0:0
> >>>>
> >>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010
> ]
> >>>>        at
> >>>>
> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123)
> >>>>        at
> >>>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:422)
> >>>>        at
> >>>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:96)
> >>>>        at
> >>>>
> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:273)
> >>>>        at
> >>>>
> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:243)
> >>>>        at
> >>>>
> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
> >>>>        at
> >>>>
> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
> >>>>        at
> >>>>
> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
> >>>>        at
> >>>>
> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
> >>>>        at
> >>>>
> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
> >>>>        at
> >>>>
> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
> >>>>        at
> >>>>
> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
> >>>>        at
> >>>>
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
> >>>>        at
> >>>>
> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
> >>>>        at
> >>>>
> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
> >>>>        at
> >>>>
> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
> >>>>        at
> >>>>
> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
> >>>>        at
> >>>>
> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
> >>>>        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
> >>>>        at
> >>>>
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
> >>>>        at java.lang.Thread.run(Thread.java:745)
> >>>> Caused by: java.lang.Exception: DATA_READ ERROR: Error parsing file
> >>>>
> >>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
> >>>>
> >>>> Line: 1
> >>>> DATA_WRITE ERROR: null
> >>>>
> >>>> Could not write string:
> >>>> index: 0, length: 342 (expected: range(0, 256))
> >>>>
> >>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
> >>>>
> >>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
> >>>> Fragment 0:0
> >>>>
> >>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010
> ]
> >>>>        at
> >>>>
> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:633)
> >>>>        at
> >>>>
> org.apache.drill.exec.store.syslog.SyslogRecordReader.next(SyslogRecordReader.java:169)
> >>>>        at
> >>>>
> org.apache.drill.exec.physical.impl.ScanBatch.internalNext(ScanBatch.java:223)
> >>>>        at
> >>>> org.apache.drill.exec.physical.impl.ScanBatch.next(ScanBatch.java:271)
> >>>>        at
> >>>>
> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:126)
> >>>>        at
> >>>>
> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:116)
> >>>>        at
> >>>>
> org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext(AbstractUnaryRecordBatch.java:63)
> >>>>        at
> >>>>
> org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:143)
> >>>>        at
> >>>>
> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:186)
> >>>>        at
> >>>>
> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:104)
> >>>>        at
> >>>>
> org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:83)
> >>>>        at
> >>>>
> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:94)
> >>>>        at org.apache.drill.exec.work
> >>>> .fragment.FragmentExecutor$1.run(FragmentExecutor.java:297)
> >>>>        at org.apache.drill.exec.work
> >>>> .fragment.FragmentExecutor$1.run(FragmentExecutor.java:284)
> >>>>        at .......(:0)
> >>>>        at
> >>>>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
> >>>>        at org.apache.drill.exec.work
> >>>> .fragment.FragmentExecutor.run(FragmentExecutor.java:284)
> >>>>        at
> >>>>
> org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
> >>>>        at .......(:0)
> >>>>
> >>>>
> >>>> If you run a query w/o long fields it works fine:
> >>>> jdbc:drill:zk=local> select * from dfs.test.`logs.syslog1`;
> >>>>
> >>>>
> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
> >>>> |        event_date        | severity_code  | facility_code  |
> severity
> >>>> | facility  |           ip           | app_name  | message_id  |
> >>>>        message                     | process_id  |
> >>>>       structured_data_text                               |
> >>>>             structured_data                              |
> >>>>
> >>>>
> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
> >>>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
> >>>> | AUTH      | mymachine.example.com  | su        | ID47        |
> BOM'su
> >>>> root' failed for lonvick on /dev/pts/8  | null        | null
> >>>>                                                             | {}
> >>>>                                                             |
> >>>> | 1985-04-12 23:20:50.52   | 2              | 4              | CRIT
> >>>> | AUTH      | mymachine.example.com  | su        | ID47        |
> BOM'su
> >>>> root' failed for lonvick on /dev/pts/8  | null        | null
> >>>>                                                             | {}
> >>>>                                                             |
> >>>> | 1985-04-12 23:20:50.52   | 2              | 4              | CRIT
> >>>> | AUTH      | mymachine.example.com  | su        | ID47        |
> BOM'su
> >>>> root' failed for lonvick on /dev/pts/8  | null        | null
> >>>>                                                             | {}
> >>>>                                                             |
> >>>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
> >>>> | AUTH      | mymachine.example.com  | su        | ID47        |
> BOM'su
> >>>> root' failed for lonvick on /dev/pts/8  | null        | null
> >>>>                                                             | {}
> >>>>                                                             |
> >>>> | 2003-08-24 12:14:15.0    | 2              | 4              | CRIT
> >>>> | AUTH      | mymachine.example.com  | su        | ID47        |
> BOM'su
> >>>> root' failed for lonvick on /dev/pts/8  | null        | null
> >>>>                                                             | {}
> >>>>                                                             |
> >>>> | 2003-08-24 12:14:15.0    | 5              | 20             | NOTICE
> >>>> | LOCAL4    | 192.0.2.1              | myproc    | null        | %%
> It's
> >>>> time to make the do-nuts.              | 8710        | null
> >>>>                                                             | {}
> >>>>                                                             |
> >>>> | 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE
> >>>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | null
> >>>>                                         | null        |
> >>>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
> >>>> eventSource=Application, eventID=1011]} |
> >>>>
> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
> >>>> | 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE
> >>>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | -
> and
> >>>> thats a wrap!                            | null        |
> >>>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
> >>>> eventSource=Application, eventID=1011]} |
> >>>>
> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
> >>>>
> >>>>
> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
> >>>> 8 rows selected (1.702 seconds)
> >>>>
> >>>>
> >>>>> On Jan 24, 2019, at 17:02, Karthikeyan Manivannan <
> >>>> kmanivannan@mapr.com> wrote:
> >>>>>
> >>>>> Hi Charles,
> >>>>>
> >>>>> Can you please provide the stack trace.
> >>>>>
> >>>>> Thanks
> >>>>>
> >>>>> Karthik
> >>>>>
> >>>>> On Tue, Jan 22, 2019 at 9:18 PM Charles Givre <cg...@gmail.com>
> >>>> wrote:
> >>>>>
> >>>>>> Hello all,
> >>>>>> I’m working on a format plugin to read syslog data, and have
> >>>> encountered
> >>>>>> what seems to be a bit of a regression (maybe).   The code below is
> a
> >>>>>> helper function which writes strings from the data.  As of Drill
> >>>> 1.16, the
> >>>>>> varchar holder seems to throw an error if the string you are trying
> to
> >>>>>> write is > 256 characters.  Is there a workaround?
> >>>>>>
> >>>>>> Thanks!
> >>>>>>
> >>>>>>
> >>>>>> //Helper function to map strings
> >>>>>> private void mapStringField(String name, String value,
> >>>>>> BaseWriter.MapWriter map) {
> >>>>>> if (value == null) {
> >>>>>>   return;
> >>>>>> }
> >>>>>> try {
> >>>>>>   byte[] bytes = value.getBytes("UTF-8");
> >>>>>>   int stringLength = bytes.length;
> >>>>>>   this.buffer.setBytes(0, bytes, 0, stringLength);
> >>>>>>   map.varChar(name).writeVarChar(0, stringLength, buffer);
> >>>>>> } catch (Exception e) {
> >>>>>>   throw UserException
> >>>>>>           .dataWriteError()
> >>>>>>           .addContext("Could not write string: ")
> >>>>>>           .addContext(e.getMessage())
> >>>>>>           .build(logger);
> >>>>>> }
> >>>>>> }
> >>>>>>
> >>>>>>
> >>>>>>
> >>>>>>
> >>>>>>
> >>>>
> >>>>
>
>

Re: Regression? Drill Truncating Varchars

Posted by Charles Givre <cg...@gmail.com>.
Hi Karthik, 
Thanks for your response.  This was working in Drill 1.15 when I submitted the original PR.  After I updated the code for Drill 1.16, the unit tests suddenly started failing and I couldn’t figure out why.  After some digging, I found that the buffer was the issue.  I was under the impression that getManagedBuffer() would allocate and/or reallocate space as needed, but apparently that is not the case.  Did something change in the BufferManagerImpl class for Drill 1.16? 

In any event, what would the best practice be?  Would you suggest just setting the buffer size to a large value and hoping for the best?  
Thanks!
— C
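
The pattern Paul Rogers suggests downthread — reallocIfNeeded(), which keeps the buffer when it is already large enough and otherwise grows it to the next power of two — can be sketched in plain Java. Note that ReallocSketch below illustrates the sizing logic only; it is not a Drill class, and in Drill the actual call is DrillBuf.reallocIfNeeded().

```java
// Illustrative sketch (not a Drill API): the "realloc if needed" pattern.
// Reuse the existing buffer when it has enough capacity; otherwise
// allocate a new one sized to the next power of two >= the request.
public class ReallocSketch {
  static byte[] reallocIfNeeded(byte[] buf, int needed) {
    if (buf.length >= needed) {
      return buf;                        // big enough: reuse as-is
    }
    int newSize = Integer.highestOneBit(needed);
    if (newSize < needed) {
      newSize <<= 1;                     // round up to next power of two
    }
    return new byte[newSize];
  }

  public static void main(String[] args) {
    // The failing syslog line was 342 bytes against a 256-byte buffer:
    byte[] buf = new byte[256];
    buf = reallocIfNeeded(buf, 342);
    System.out.println(buf.length);      // 512
  }
}
```

Applied to the helper in the original post, this means guarding the copy with `buffer = buffer.reallocIfNeeded(stringLength);` before the `setBytes()` call.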




> On Jan 25, 2019, at 20:17, Karthikeyan Manivannan <km...@mapr.com> wrote:
> 
> Your buffer is allocated as
> 
> this.buffer = context.getManagedBuffer();
> 
> and that boils down to
> 
> BufferManagerImpl.java
> 
> @Override
> public DrillBuf getManagedBuffer() {
>  return getManagedBuffer(256);
> }
> 
> This may be the root of the problem.
> 
> Thanks.
> 
> Karthik
> 
> On Fri, Jan 25, 2019 at 3:02 PM Karthikeyan Manivannan <km...@mapr.com>
> wrote:
> 
>> Hi Charles,
>> 
>> In the code that you had pasted
>> 
>> this.buffer.setBytes(0, bytes, 0, stringLength);
>> 
>> what guarantees that this.buffer has enough space for stringLength chars?
>> 
>> Karthik
>> 
>> On Fri, Jan 25, 2019 at 2:32 PM Karthikeyan Manivannan <
>> kmanivannan@mapr.com> wrote:
>> 
>>> Charles,
>>> 
>>> Does this work on 1.15 ?
>>> 
>>> Drill 1.16 is able to correctly read CSV files with > 256 char strings,
>>> so I guess the problem might be in the Syslog plugin code.
>>> Can you share your format plugin code ?
>>> 
>>> Thanks.
>>> 
>>> Karthik
>>> 
>>> 
>>> On Thu, Jan 24, 2019 at 4:24 PM Charles Givre <cg...@gmail.com> wrote:
>>> 
>>>> Here you go… Thanks for your help!
>>>> 
>>>> 
>>>> SELECT
>>>> event_date,severity_code,facility_code,severity,facility,ip,app_name,process_id,message_id,structured_data_text,structured_data_UserAgent,structured_data_UserHostAddress,structured_data_BrowserSession,structured_data_Realm,structured_data_Appliance,structured_data_Company,structured_data_UserID,structured_data_PEN,structured_data_HostName,structured_data_Category,structured_data_Priority,message
>>>> FROM cp.`syslog/test.syslog1`
>>>> 18:16:50.061 [main] ERROR org.apache.drill.TestReporter - Test Failed
>>>> (d: 0 B(1 B), h: 21.6 MiB(130.9 MiB), nh: 2.0 MiB(82.2 MiB)):
>>>> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>> java.lang.NullPointerException: null
>>>>        at
>>>> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
>>>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>>>>        at
>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>>>> ~[test-classes/:na]
>>>>        at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
>>>> 18:16:50.571 [main] ERROR org.apache.drill.TestReporter - Test Failed
>>>> (d: 0 B(1 B), h: 20.5 MiB(209.9 MiB), nh: 411.1 KiB(85.4 MiB)):
>>>> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>> java.lang.NullPointerException: null
>>>>        at
>>>> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
>>>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>>>>        at
>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>>>> ~[test-classes/:na]
>>>>        at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
>>>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0, Time elapsed:
>>>> 3.173 s <<< FAILURE! - in
>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat
>>>> [ERROR]
>>>> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>> Time elapsed: 0.234 s  <<< ERROR!
>>>> java.lang.NullPointerException
>>>>        at
>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>>>> 
>>>> [ERROR]
>>>> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>>> Time elapsed: 0.125 s  <<< ERROR!
>>>> java.lang.NullPointerException
>>>>        at
>>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>>>> 
>>>> [INFO]
>>>> [INFO] Results:
>>>> [INFO]
>>>> [ERROR] Errors:
>>>> [ERROR]   TestSyslogFormat.testExplicitFlattenedStructuredDataQuery:310
>>>> » NullPointer
>>>> [ERROR]   TestSyslogFormat.testStarFlattenedStructuredDataQuery:248 »
>>>> NullPointer
>>>> [INFO]
>>>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0
>>>> [INFO]
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [INFO] BUILD FAILURE
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [INFO] Total time:  33.919 s
>>>> [INFO] Finished at: 2019-01-24T18:16:51-05:00
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [ERROR] Failed to execute goal
>>>> org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test (default-test)
>>>> on project drill-format-syslog: There are test failures.
>>>> [ERROR]
>>>> [ERROR] Please refer to
>>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>>> for the individual test results.
>>>> [ERROR] Please refer to dump files (if any exist) [date].dump,
>>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>> [ERROR] -> [Help 1]
>>>> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to
>>>> execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test
>>>> (default-test) on project drill-format-syslog: There are test failures.
>>>> 
>>>> Please refer to
>>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>>> for the individual test results.
>>>> Please refer to dump files (if any exist) [date].dump,
>>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:215)
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:156)
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:148)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>> (LifecycleModuleBuilder.java:117)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>> (LifecycleModuleBuilder.java:81)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>>>> (SingleThreadedBuilder.java:56)
>>>>    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
>>>> (LifecycleStarter.java:128)
>>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>>>>    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>>>>    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>>>>    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>>>>    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>>>>    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>>>>    at sun.reflect.NativeMethodAccessorImpl.invoke
>>>> (NativeMethodAccessorImpl.java:62)
>>>>    at sun.reflect.DelegatingMethodAccessorImpl.invoke
>>>> (DelegatingMethodAccessorImpl.java:43)
>>>>    at java.lang.reflect.Method.invoke (Method.java:497)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
>>>> (Launcher.java:289)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launch
>>>> (Launcher.java:229)
>>>>    at
>>>> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
>>>> (Launcher.java:415)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.main
>>>> (Launcher.java:356)
>>>> Caused by: org.apache.maven.plugin.MojoFailureException: There are test
>>>> failures.
>>>> 
>>>> Please refer to
>>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>>> for the individual test results.
>>>> Please refer to dump files (if any exist) [date].dump,
>>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>>    at org.apache.maven.plugin.surefire.SurefireHelper.throwException
>>>> (SurefireHelper.java:271)
>>>>    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution
>>>> (SurefireHelper.java:159)
>>>>    at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary
>>>> (SurefirePlugin.java:373)
>>>>    at
>>>> org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked
>>>> (AbstractSurefireMojo.java:1016)
>>>>    at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute
>>>> (AbstractSurefireMojo.java:846)
>>>>    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo
>>>> (DefaultBuildPluginManager.java:137)
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:210)
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:156)
>>>>    at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>>> (MojoExecutor.java:148)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>> (LifecycleModuleBuilder.java:117)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>>> (LifecycleModuleBuilder.java:81)
>>>>    at
>>>> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>>>> (SingleThreadedBuilder.java:56)
>>>>    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
>>>> (LifecycleStarter.java:128)
>>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>>>>    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>>>>    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>>>>    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>>>>    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>>>>    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>>>>    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>>>>    at sun.reflect.NativeMethodAccessorImpl.invoke
>>>> (NativeMethodAccessorImpl.java:62)
>>>>    at sun.reflect.DelegatingMethodAccessorImpl.invoke
>>>> (DelegatingMethodAccessorImpl.java:43)
>>>>    at java.lang.reflect.Method.invoke (Method.java:497)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
>>>> (Launcher.java:289)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.launch
>>>> (Launcher.java:229)
>>>>    at
>>>> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
>>>> (Launcher.java:415)
>>>>    at org.codehaus.plexus.classworlds.launcher.Launcher.main
>>>> (Launcher.java:356)
>>>> [ERROR]
>>>> [ERROR]
>>>> [ERROR] For more information about the errors and possible solutions,
>>>> please read the following articles:
>>>> [ERROR] [Help 1]
>>>> https://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
>>>> 
>>>> 
>>>> If you run the query in sqlline, here is the result:
>>>> 
>>>> 
>>>> jdbc:drill:zk=local> select * from dfs.test.`test.syslog1`;
>>>> Error: DATA_READ ERROR: Error parsing file
>>>> 
>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>> 
>>>> Line: 1
>>>> DATA_WRITE ERROR: null
>>>> 
>>>> Could not write string:
>>>> index: 0, length: 342 (expected: range(0, 256))
>>>> 
>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>> 
>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>> Fragment 0:0
>>>> 
>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>> (state=,code=0)
>>>> java.sql.SQLException: DATA_READ ERROR: Error parsing file
>>>> 
>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>> 
>>>> Line: 1
>>>> DATA_WRITE ERROR: null
>>>> 
>>>> Could not write string:
>>>> index: 0, length: 342 (expected: range(0, 256))
>>>> 
>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>> 
>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>> Fragment 0:0
>>>> 
>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:536)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:608)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1288)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:61)
>>>>        at
>>>> org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1107)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1118)
>>>>        at
>>>> org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
>>>>        at
>>>> org.apache.drill.jdbc.impl.DrillConnectionImpl.prepareAndExecuteInternal(DrillConnectionImpl.java:200)
>>>>        at
>>>> org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
>>>>        at
>>>> org.apache.calcite.avatica.AvaticaStatement.execute(AvaticaStatement.java:217)
>>>>        at sqlline.Commands.execute(Commands.java:938)
>>>>        at sqlline.Commands.sql(Commands.java:882)
>>>>        at sqlline.SqlLine.dispatch(SqlLine.java:725)
>>>>        at sqlline.SqlLine.begin(SqlLine.java:540)
>>>>        at sqlline.SqlLine.start(SqlLine.java:264)
>>>>        at sqlline.SqlLine.main(SqlLine.java:195)
>>>> Caused by: org.apache.drill.common.exceptions.UserRemoteException:
>>>> DATA_READ ERROR: Error parsing file
>>>> 
>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>> 
>>>> Line: 1
>>>> DATA_WRITE ERROR: null
>>>> 
>>>> Could not write string:
>>>> index: 0, length: 342 (expected: range(0, 256))
>>>> 
>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>> 
>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>> Fragment 0:0
>>>> 
>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>>        at
>>>> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123)
>>>>        at
>>>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:422)
>>>>        at
>>>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:96)
>>>>        at
>>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:273)
>>>>        at
>>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:243)
>>>>        at
>>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>        at
>>>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>        at
>>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>        at
>>>> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
>>>>        at
>>>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>        at
>>>> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>>        at
>>>> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>>        at
>>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>>        at
>>>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
>>>>        at
>>>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>>>>        at
>>>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
>>>>        at
>>>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
>>>>        at
>>>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
>>>>        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
>>>>        at
>>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
>>>>        at java.lang.Thread.run(Thread.java:745)
>>>> Caused by: java.lang.Exception: DATA_READ ERROR: Error parsing file
>>>> 
>>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>> 
>>>> Line: 1
>>>> DATA_WRITE ERROR: null
>>>> 
>>>> Could not write string:
>>>> index: 0, length: 342 (expected: range(0, 256))
>>>> 
>>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>> 
>>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>>> Fragment 0:0
>>>> 
>>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>>        at
>>>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:633)
>>>>        at
>>>> org.apache.drill.exec.store.syslog.SyslogRecordReader.next(SyslogRecordReader.java:169)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.ScanBatch.internalNext(ScanBatch.java:223)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.ScanBatch.next(ScanBatch.java:271)
>>>>        at
>>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:126)
>>>>        at
>>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:116)
>>>>        at
>>>> org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext(AbstractUnaryRecordBatch.java:63)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:143)
>>>>        at
>>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:186)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:104)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:83)
>>>>        at
>>>> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:94)
>>>>        at
>>>> org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:297)
>>>>        at
>>>> org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:284)
>>>>        at .......(:0)
>>>>        at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>>>>        at
>>>> org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:284)
>>>>        at
>>>> org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
>>>>        at .......(:0)
>>>> 
>>>> 
>>>> If you run a query w/o long fields it works fine:
>>>> jdbc:drill:zk=local> select * from dfs.test.`logs.syslog1`;
>>>> 
>>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>>> |        event_date        | severity_code  | facility_code  | severity
>>>> | facility  |           ip           | app_name  | message_id  |
>>>>        message                     | process_id  |
>>>>       structured_data_text                               |
>>>>             structured_data                              |
>>>> 
>>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>                                                             | {}
>>>>                                                             |
>>>> | 1985-04-12 23:20:50.52   | 2              | 4              | CRIT
>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>                                                             | {}
>>>>                                                             |
>>>> | 1985-04-12 23:20:50.52   | 2              | 4              | CRIT
>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>                                                             | {}
>>>>                                                             |
>>>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>                                                             | {}
>>>>                                                             |
>>>> | 2003-08-24 12:14:15.0    | 2              | 4              | CRIT
>>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>>                                                             | {}
>>>>                                                             |
>>>> | 2003-08-24 12:14:15.0    | 5              | 20             | NOTICE
>>>> | LOCAL4    | 192.0.2.1              | myproc    | null        | %% It's
>>>> time to make the do-nuts.              | 8710        | null
>>>>                                                             | {}
>>>>                                                             |
>>>> | 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE
>>>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | null
>>>>                                         | null        |
>>>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
>>>> eventSource=Application, eventID=1011]} |
>>>> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>>>> | 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE
>>>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | - and
>>>> thats a wrap!                            | null        |
>>>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
>>>> eventSource=Application, eventID=1011]} |
>>>> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>>>> 
>>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>>> 8 rows selected (1.702 seconds)
>>>> 
>>>> 
>>>>> On Jan 24, 2019, at 17:02, Karthikeyan Manivannan <
>>>> kmanivannan@mapr.com> wrote:
>>>>> 
>>>>> Hi Charles,
>>>>> 
>>>>> Can you please provide the stack trace.
>>>>> 
>>>>> Thanks
>>>>> 
>>>>> Karthik
>>>>> 
>>>>> On Tue, Jan 22, 2019 at 9:18 PM Charles Givre <cg...@gmail.com>
>>>> wrote:
>>>>> 
>>>>>> Hello all,
>>>>>> I’m working on a format plugin to read syslog data, and have
>>>> encountered
>>>>>> what seems to be a bit of a regression (maybe).   The code below is a
>>>>>> helper function which writes strings from the data.  As of Drill
>>>> 1.16, the
>>>>>> varchar holder seems to throw an error if the string you are trying to
>>>>>> write is > 256 characters.  Is there a workaround?
>>>>>> 
>>>>>> Thanks!
>>>>>> 
>>>>>> 
>>>>>> //Helper function to map strings
>>>>>> private void mapStringField(String name, String value,
>>>>>> BaseWriter.MapWriter map) {
>>>>>> if (value == null) {
>>>>>>   return;
>>>>>> }
>>>>>> try {
>>>>>>   byte[] bytes = value.getBytes("UTF-8");
>>>>>>   int stringLength = bytes.length;
>>>>>>   this.buffer.setBytes(0, bytes, 0, stringLength);
>>>>>>   map.varChar(name).writeVarChar(0, stringLength, buffer);
>>>>>> } catch (Exception e) {
>>>>>>   throw UserException
>>>>>>           .dataWriteError()
>>>>>>           .addContext("Could not write string: ")
>>>>>>           .addContext(e.getMessage())
>>>>>>           .build(logger);
>>>>>> }
>>>>>> }
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>> 
>>>> 


Re: Regression? Drill Truncating Varchars

Posted by Karthikeyan Manivannan <km...@mapr.com>.
Your buffer is allocated as

this.buffer = context.getManagedBuffer();

and that boils down to

BufferManagerImpl.java

@Override
public DrillBuf getManagedBuffer() {
  return getManagedBuffer(256);
}

This may be the root of the problem.
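
A fix along the lines Paul described is to grow the buffer before writing, e.g. `buffer = buffer.reallocIfNeeded(stringLength);` ahead of the `setBytes` call. As a self-contained illustration of that sizing policy (grow to the next power of two at or above the requested length), here is a sketch using a plain byte[]; the class and method names below are hypothetical, not Drill's actual DrillBuf:

```java
// Illustration only: mimics the reallocIfNeeded sizing policy described
// in this thread (return the buffer unchanged if it is big enough,
// otherwise grow to the next power of two above the requested length).
// GrowableBuffer is a hypothetical stand-in for DrillBuf.
public class GrowableBuffer {
  private byte[] data;

  public GrowableBuffer(int initialCapacity) {
    data = new byte[initialCapacity];
  }

  // Smallest power of two >= n (for n >= 1).
  static int nextPowerOfTwo(int n) {
    return (n <= 1) ? 1 : Integer.highestOneBit(n - 1) << 1;
  }

  // Returns this buffer unchanged if it already holds `needed` bytes;
  // otherwise replaces the backing array with a larger one.
  public GrowableBuffer reallocIfNeeded(int needed) {
    if (needed > data.length) {
      data = new byte[nextPowerOfTwo(needed)];
    }
    return this;
  }

  public int capacity() {
    return data.length;
  }
}
```

With the default 256-byte managed buffer, the failing 342-byte syslog string would trigger one reallocation to 512 bytes under this policy.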

Thanks.

Karthik

On Fri, Jan 25, 2019 at 3:02 PM Karthikeyan Manivannan <km...@mapr.com>
wrote:

> Hi Charles,
>
> In the code that you had pasted
>
> this.buffer.setBytes(0, bytes, 0, stringLength);
>
> what guarantees that this.buffer has enough space for stringLength chars?
>
> Karthik
>
> On Fri, Jan 25, 2019 at 2:32 PM Karthikeyan Manivannan <
> kmanivannan@mapr.com> wrote:
>
>> Charles,
>>
>> Does this work on 1.15 ?
>>
>> Drill 1.16 is able to correctly read CSV files with > 256 char strings,
>> so I guess the problem might be in the Syslog plugin code.
>> Can you share your format plugin code ?
>>
>> Thanks.
>>
>> Karthik
>>
>>
>> On Thu, Jan 24, 2019 at 4:24 PM Charles Givre <cg...@gmail.com> wrote:
>>
>>> Here you go… Thanks for your help!
>>>
>>>
>>> SELECT
>>> event_date,severity_code,facility_code,severity,facility,ip,app_name,process_id,message_id,structured_data_text,structured_data_UserAgent,structured_data_UserHostAddress,structured_data_BrowserSession,structured_data_Realm,structured_data_Appliance,structured_data_Company,structured_data_UserID,structured_data_PEN,structured_data_HostName,structured_data_Category,structured_data_Priority,message
>>> FROM cp.`syslog/test.syslog1`
>>> 18:16:50.061 [main] ERROR org.apache.drill.TestReporter - Test Failed
>>> (d: 0 B(1 B), h: 21.6 MiB(130.9 MiB), nh: 2.0 MiB(82.2 MiB)):
>>> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>> java.lang.NullPointerException: null
>>>         at
>>> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
>>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>>>         at
>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>>> ~[test-classes/:na]
>>>         at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
>>> 18:16:50.571 [main] ERROR org.apache.drill.TestReporter - Test Failed
>>> (d: 0 B(1 B), h: 20.5 MiB(209.9 MiB), nh: 411.1 KiB(85.4 MiB)):
>>> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>> java.lang.NullPointerException: null
>>>         at
>>> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
>>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>>>         at
>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>>> ~[test-classes/:na]
>>>         at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
>>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0, Time elapsed:
>>> 3.173 s <<< FAILURE! - in
>>> org.apache.drill.exec.store.syslog.TestSyslogFormat
>>> [ERROR]
>>> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>> Time elapsed: 0.234 s  <<< ERROR!
>>> java.lang.NullPointerException
>>>         at
>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>>>
>>> [ERROR]
>>> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>>> Time elapsed: 0.125 s  <<< ERROR!
>>> java.lang.NullPointerException
>>>         at
>>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>>>
>>> [INFO]
>>> [INFO] Results:
>>> [INFO]
>>> [ERROR] Errors:
>>> [ERROR]   TestSyslogFormat.testExplicitFlattenedStructuredDataQuery:310
>>> » NullPointer
>>> [ERROR]   TestSyslogFormat.testStarFlattenedStructuredDataQuery:248 »
>>> NullPointer
>>> [INFO]
>>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0
>>> [INFO]
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] BUILD FAILURE
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] Total time:  33.919 s
>>> [INFO] Finished at: 2019-01-24T18:16:51-05:00
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [ERROR] Failed to execute goal
>>> org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test (default-test)
>>> on project drill-format-syslog: There are test failures.
>>> [ERROR]
>>> [ERROR] Please refer to
>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>> for the individual test results.
>>> [ERROR] Please refer to dump files (if any exist) [date].dump,
>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>> [ERROR] -> [Help 1]
>>> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to
>>> execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test
>>> (default-test) on project drill-format-syslog: There are test failures.
>>>
>>> Please refer to
>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>> for the individual test results.
>>> Please refer to dump files (if any exist) [date].dump,
>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>> (MojoExecutor.java:215)
>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>> (MojoExecutor.java:156)
>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>> (MojoExecutor.java:148)
>>>     at
>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>> (LifecycleModuleBuilder.java:117)
>>>     at
>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>> (LifecycleModuleBuilder.java:81)
>>>     at
>>> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>>> (SingleThreadedBuilder.java:56)
>>>     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
>>> (LifecycleStarter.java:128)
>>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>>>     at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>>>     at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>>>     at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>>>     at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke
>>> (NativeMethodAccessorImpl.java:62)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke
>>> (DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke (Method.java:497)
>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
>>> (Launcher.java:289)
>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launch
>>> (Launcher.java:229)
>>>     at
>>> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
>>> (Launcher.java:415)
>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.main
>>> (Launcher.java:356)
>>> Caused by: org.apache.maven.plugin.MojoFailureException: There are test
>>> failures.
>>>
>>> Please refer to
>>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>>> for the individual test results.
>>> Please refer to dump files (if any exist) [date].dump,
>>> [date]-jvmRun[N].dump and [date].dumpstream.
>>>     at org.apache.maven.plugin.surefire.SurefireHelper.throwException
>>> (SurefireHelper.java:271)
>>>     at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution
>>> (SurefireHelper.java:159)
>>>     at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary
>>> (SurefirePlugin.java:373)
>>>     at
>>> org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked
>>> (AbstractSurefireMojo.java:1016)
>>>     at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute
>>> (AbstractSurefireMojo.java:846)
>>>     at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo
>>> (DefaultBuildPluginManager.java:137)
>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>> (MojoExecutor.java:210)
>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>> (MojoExecutor.java:156)
>>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>>> (MojoExecutor.java:148)
>>>     at
>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>> (LifecycleModuleBuilder.java:117)
>>>     at
>>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>>> (LifecycleModuleBuilder.java:81)
>>>     at
>>> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>>> (SingleThreadedBuilder.java:56)
>>>     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
>>> (LifecycleStarter.java:128)
>>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>>>     at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>>>     at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>>>     at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>>>     at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke
>>> (NativeMethodAccessorImpl.java:62)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke
>>> (DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke (Method.java:497)
>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
>>> (Launcher.java:289)
>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launch
>>> (Launcher.java:229)
>>>     at
>>> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
>>> (Launcher.java:415)
>>>     at org.codehaus.plexus.classworlds.launcher.Launcher.main
>>> (Launcher.java:356)
>>> [ERROR]
>>> [ERROR]
>>> [ERROR] For more information about the errors and possible solutions,
>>> please read the following articles:
>>> [ERROR] [Help 1]
>>> https://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
>>>
>>>
>>> If you run the query in sqlline, here is the result:
>>>
>>>
>>> jdbc:drill:zk=local> select * from dfs.test.`test.syslog1`;
>>> Error: DATA_READ ERROR: Error parsing file
>>>
>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>
>>> Line: 1
>>> DATA_WRITE ERROR: null
>>>
>>> Could not write string:
>>> index: 0, length: 342 (expected: range(0, 256))
>>>
>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>
>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>> Fragment 0:0
>>>
>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>> (state=,code=0)
>>> java.sql.SQLException: DATA_READ ERROR: Error parsing file
>>>
>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>
>>> Line: 1
>>> DATA_WRITE ERROR: null
>>>
>>> Could not write string:
>>> index: 0, length: 342 (expected: range(0, 256))
>>>
>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>
>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>> Fragment 0:0
>>>
>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>         at
>>> org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:536)
>>>         at
>>> org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:608)
>>>         at
>>> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1288)
>>>         at
>>> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:61)
>>>         at
>>> org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
>>>         at
>>> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1107)
>>>         at
>>> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1118)
>>>         at
>>> org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
>>>         at
>>> org.apache.drill.jdbc.impl.DrillConnectionImpl.prepareAndExecuteInternal(DrillConnectionImpl.java:200)
>>>         at
>>> org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
>>>         at
>>> org.apache.calcite.avatica.AvaticaStatement.execute(AvaticaStatement.java:217)
>>>         at sqlline.Commands.execute(Commands.java:938)
>>>         at sqlline.Commands.sql(Commands.java:882)
>>>         at sqlline.SqlLine.dispatch(SqlLine.java:725)
>>>         at sqlline.SqlLine.begin(SqlLine.java:540)
>>>         at sqlline.SqlLine.start(SqlLine.java:264)
>>>         at sqlline.SqlLine.main(SqlLine.java:195)
>>> Caused by: org.apache.drill.common.exceptions.UserRemoteException:
>>> DATA_READ ERROR: Error parsing file
>>>
>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>
>>> Line: 1
>>> DATA_WRITE ERROR: null
>>>
>>> Could not write string:
>>> index: 0, length: 342 (expected: range(0, 256))
>>>
>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>
>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>> Fragment 0:0
>>>
>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>         at
>>> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123)
>>>         at
>>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:422)
>>>         at
>>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:96)
>>>         at
>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:273)
>>>         at
>>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:243)
>>>         at
>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>         at
>>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>         at
>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>         at
>>> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
>>>         at
>>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>         at
>>> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>>         at
>>> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>>         at
>>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
>>>         at
>>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>>>         at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
>>>         at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
>>>         at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
>>>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
>>>         at
>>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
>>>         at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.lang.Exception: DATA_READ ERROR: Error parsing file
>>>
>>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>>
>>> Line: 1
>>> DATA_WRITE ERROR: null
>>>
>>> Could not write string:
>>> index: 0, length: 342 (expected: range(0, 256))
>>>
>>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>>
>>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>>> Fragment 0:0
>>>
>>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>>         at
>>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:633)
>>>         at
>>> org.apache.drill.exec.store.syslog.SyslogRecordReader.next(SyslogRecordReader.java:169)
>>>         at
>>> org.apache.drill.exec.physical.impl.ScanBatch.internalNext(ScanBatch.java:223)
>>>         at
>>> org.apache.drill.exec.physical.impl.ScanBatch.next(ScanBatch.java:271)
>>>         at
>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:126)
>>>         at
>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:116)
>>>         at
>>> org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext(AbstractUnaryRecordBatch.java:63)
>>>         at
>>> org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:143)
>>>         at
>>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:186)
>>>         at
>>> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:104)
>>>         at
>>> org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:83)
>>>         at
>>> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:94)
>>>         at org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:297)
>>>         at org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:284)
>>>         at .......(:0)
>>>         at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>>>         at org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:284)
>>>         at
>>> org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
>>>         at .......(:0)
>>>
>>>
>>> If you run a query w/o long fields it works fine:
>>> dbc:drill:zk=local> select * from dfs.test.`logs.syslog1`;
>>>
>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>> |        event_date        | severity_code  | facility_code  | severity
>>> | facility  |           ip           | app_name  | message_id  |
>>>         message                     | process_id  |
>>>        structured_data_text                               |
>>>              structured_data                              |
>>>
>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>                                                              | {}
>>>                                                              |
>>> | 1985-04-12 23:20:50.52   | 2              | 4              | CRIT
>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>                                                              | {}
>>>                                                              |
>>> | 1985-04-12 23:20:50.52   | 2              | 4              | CRIT
>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>                                                              | {}
>>>                                                              |
>>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>                                                              | {}
>>>                                                              |
>>> | 2003-08-24 12:14:15.0    | 2              | 4              | CRIT
>>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>>> root' failed for lonvick on /dev/pts/8  | null        | null
>>>                                                              | {}
>>>                                                              |
>>> | 2003-08-24 12:14:15.0    | 5              | 20             | NOTICE
>>> | LOCAL4    | 192.0.2.1              | myproc    | null        | %% It's
>>> time to make the do-nuts.              | 8710        | null
>>>                                                              | {}
>>>                                                              |
>>> | 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE
>>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | null
>>>                                          | null        |
>>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
>>> eventSource=Application, eventID=1011]} |
>>> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>>> | 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE
>>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | - and
>>> thats a wrap!                            | null        |
>>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
>>> eventSource=Application, eventID=1011]} |
>>> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>>>
>>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>>> 8 rows selected (1.702 seconds)
>>>
>>>
>>> > On Jan 24, 2019, at 17:02, Karthikeyan Manivannan <kmanivannan@mapr.com> wrote:
>>> >
>>> > Hi Charles,
>>> >
>>> > Can you please provide the stack trace.
>>> >
>>> > Thanks
>>> >
>>> > Karthik
>>> >
>>> > On Tue, Jan 22, 2019 at 9:18 PM Charles Givre <cg...@gmail.com> wrote:
>>> >
>>> >> Hello all,
>>> >> I’m working on a format plugin to read syslog data, and have
>>> encountered
>>> >> what seems to be a bit of a regression (maybe).   The code below is a
>>> >> helper function which writes strings from the data.  As of Drill
>>> 1.16, the
>>> >> varchar holder seems to throw an error if the string you are trying to
>>> >> write is > 256 characters.  Is there a workaround?
>>> >>
>>> >> Thanks!
>>> >>
>>> >>
>>> >> //Helper function to map strings
>>> >> private void mapStringField(String name, String value,
>>> >> BaseWriter.MapWriter map) {
>>> >>  if (value == null) {
>>> >>    return;
>>> >>  }
>>> >>  try {
>>> >>    byte[] bytes = value.getBytes("UTF-8");
>>> >>    int stringLength = bytes.length;
>>> >>    this.buffer.setBytes(0, bytes, 0, stringLength);
>>> >>    map.varChar(name).writeVarChar(0, stringLength, buffer);
>>> >>  } catch (Exception e) {
>>> >>    throw UserException
>>> >>            .dataWriteError()
>>> >>            .addContext("Could not write string: ")
>>> >>            .addContext(e.getMessage())
>>> >>            .build(logger);
>>> >>  }
>>> >> }
>>> >>
>>> >>
>>> >>
>>> >>
>>> >>
>>>
>>>

Re: Regression? Drill Truncating Varchars

Posted by Karthikeyan Manivannan <km...@mapr.com>.
Hi Charles,

In the code that you had pasted

this.buffer.setBytes(0, bytes, 0, stringLength);

what guarantees that this.buffer has enough space for stringLength chars?

Karthik
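
As a quick way to see the failure mode outside Drill: the error "length: 342 (expected: range(0, 256))" is a bounds check on a fixed-capacity buffer, and the fix is the reallocIfNeeded grow-on-demand pattern Paul describes elsewhere in this thread. The sketch below is illustrative only — it uses java.nio.ByteBuffer as a stand-in for DrillBuf, and the next-power-of-two resize is an assumption modeled on reallocIfNeeded's documented behavior, not Drill code:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class ReallocSketch {

  // Stand-in for DrillBuf.reallocIfNeeded: return the same buffer if it is
  // already large enough, otherwise allocate one sized to the next power of
  // two at or above the requested length.
  static ByteBuffer reallocIfNeeded(ByteBuffer buf, int needed) {
    if (buf.capacity() >= needed) {
      return buf;
    }
    int newCapacity = Integer.highestOneBit(needed - 1) << 1;
    return ByteBuffer.allocate(newCapacity);
  }

  public static void main(String[] args) {
    // 256 bytes: the capacity implied by "range(0, 256)" in the error above.
    ByteBuffer buffer = ByteBuffer.allocate(256);

    // A 342-byte string, like the field that triggered the failure.
    byte[] bytes = new String(new char[342]).replace('\0', 'x')
        .getBytes(StandardCharsets.UTF_8);

    // Without this line, buffer.put(bytes) throws BufferOverflowException --
    // the ByteBuffer analogue of the bounds error seen in the stack trace.
    buffer = reallocIfNeeded(buffer, bytes.length);

    buffer.put(bytes, 0, bytes.length);
    System.out.println(buffer.capacity());  // prints 512
  }
}
```

The same call placed before this.buffer.setBytes(...) in mapStringField is what makes the writer safe for strings longer than the initial allocation.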

On Fri, Jan 25, 2019 at 2:32 PM Karthikeyan Manivannan <km...@mapr.com> wrote:

> Charles,
>
> Does this work on 1.15 ?
>
> Drill 1.16 is able to correctly read CSV files with > 256 char strings, so
> I guess the problem might be in the Syslog plugin code.
> Can you share your format plugin code ?
>
> Thanks.
>
> Karthik
>
>
> On Thu, Jan 24, 2019 at 4:24 PM Charles Givre <cg...@gmail.com> wrote:
>
>> Here you go… Thanks for your help!
>>
>>
>> SELECT
>> event_date,severity_code,facility_code,severity,facility,ip,app_name,process_id,message_id,structured_data_text,structured_data_UserAgent,structured_data_UserHostAddress,structured_data_BrowserSession,structured_data_Realm,structured_data_Appliance,structured_data_Company,structured_data_UserID,structured_data_PEN,structured_data_HostName,structured_data_Category,structured_data_Priority,message
>> FROM cp.`syslog/test.syslog1`
>> 18:16:50.061 [main] ERROR org.apache.drill.TestReporter - Test Failed (d:
>> 0 B(1 B), h: 21.6 MiB(130.9 MiB), nh: 2.0 MiB(82.2 MiB)):
>> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>> java.lang.NullPointerException: null
>>         at
>> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>>         at
>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>> ~[test-classes/:na]
>>         at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
>> 18:16:50.571 [main] ERROR org.apache.drill.TestReporter - Test Failed (d:
>> 0 B(1 B), h: 20.5 MiB(209.9 MiB), nh: 411.1 KiB(85.4 MiB)):
>> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>> java.lang.NullPointerException: null
>>         at
>> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
>> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>>         at
>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>> ~[test-classes/:na]
>>         at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0, Time elapsed:
>> 3.173 s <<< FAILURE! - in
>> org.apache.drill.exec.store.syslog.TestSyslogFormat
>> [ERROR]
>> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>> Time elapsed: 0.234 s  <<< ERROR!
>> java.lang.NullPointerException
>>         at
>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>>
>> [ERROR]
>> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
>> Time elapsed: 0.125 s  <<< ERROR!
>> java.lang.NullPointerException
>>         at
>> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>>
>> [INFO]
>> [INFO] Results:
>> [INFO]
>> [ERROR] Errors:
>> [ERROR]   TestSyslogFormat.testExplicitFlattenedStructuredDataQuery:310 »
>> NullPointer
>> [ERROR]   TestSyslogFormat.testStarFlattenedStructuredDataQuery:248 »
>> NullPointer
>> [INFO]
>> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0
>> [INFO]
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] BUILD FAILURE
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] Total time:  33.919 s
>> [INFO] Finished at: 2019-01-24T18:16:51-05:00
>> [INFO]
>> ------------------------------------------------------------------------
>> [ERROR] Failed to execute goal
>> org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test (default-test)
>> on project drill-format-syslog: There are test failures.
>> [ERROR]
>> [ERROR] Please refer to
>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>> for the individual test results.
>> [ERROR] Please refer to dump files (if any exist) [date].dump,
>> [date]-jvmRun[N].dump and [date].dumpstream.
>> [ERROR] -> [Help 1]
>> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute
>> goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test
>> (default-test) on project drill-format-syslog: There are test failures.
>>
>> Please refer to
>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>> for the individual test results.
>> Please refer to dump files (if any exist) [date].dump,
>> [date]-jvmRun[N].dump and [date].dumpstream.
>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>> (MojoExecutor.java:215)
>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>> (MojoExecutor.java:156)
>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>> (MojoExecutor.java:148)
>>     at
>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>> (LifecycleModuleBuilder.java:117)
>>     at
>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>> (LifecycleModuleBuilder.java:81)
>>     at
>> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>> (SingleThreadedBuilder.java:56)
>>     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
>> (LifecycleStarter.java:128)
>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>>     at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>>     at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>>     at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>>     at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke
>> (NativeMethodAccessorImpl.java:62)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke
>> (DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke (Method.java:497)
>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
>> (Launcher.java:289)
>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launch
>> (Launcher.java:229)
>>     at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
>> (Launcher.java:415)
>>     at org.codehaus.plexus.classworlds.launcher.Launcher.main
>> (Launcher.java:356)
>> Caused by: org.apache.maven.plugin.MojoFailureException: There are test
>> failures.
>>
>> Please refer to
>> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
>> for the individual test results.
>> Please refer to dump files (if any exist) [date].dump,
>> [date]-jvmRun[N].dump and [date].dumpstream.
>>     at org.apache.maven.plugin.surefire.SurefireHelper.throwException
>> (SurefireHelper.java:271)
>>     at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution
>> (SurefireHelper.java:159)
>>     at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary
>> (SurefirePlugin.java:373)
>>     at
>> org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked
>> (AbstractSurefireMojo.java:1016)
>>     at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute
>> (AbstractSurefireMojo.java:846)
>>     at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo
>> (DefaultBuildPluginManager.java:137)
>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>> (MojoExecutor.java:210)
>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>> (MojoExecutor.java:156)
>>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
>> (MojoExecutor.java:148)
>>     at
>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>> (LifecycleModuleBuilder.java:117)
>>     at
>> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
>> (LifecycleModuleBuilder.java:81)
>>     at
>> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
>> (SingleThreadedBuilder.java:56)
>>     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
>> (LifecycleStarter.java:128)
>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>>     at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>>     at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>>     at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>>     at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke
>> (NativeMethodAccessorImpl.java:62)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke
>> (DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke (Method.java:497)
>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
>> (Launcher.java:289)
>>     at org.codehaus.plexus.classworlds.launcher.Launcher.launch
>> (Launcher.java:229)
>>     at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
>> (Launcher.java:415)
>>     at org.codehaus.plexus.classworlds.launcher.Launcher.main
>> (Launcher.java:356)
>> [ERROR]
>> [ERROR]
>> [ERROR] For more information about the errors and possible solutions,
>> please read the following articles:
>> [ERROR] [Help 1]
>> http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
>>
>>
>> If you run the query in sqline, here is the result:
>>
>>
>> jdbc:drill:zk=local> select * from dfs.test.`test.syslog1`;
>> Error: DATA_READ ERROR: Error parsing file
>>
>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>
>> Line: 1
>> DATA_WRITE ERROR: null
>>
>> Could not write string:
>> index: 0, length: 342 (expected: range(0, 256))
>>
>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>
>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>> Fragment 0:0
>>
>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>> (state=,code=0)
>> java.sql.SQLException: DATA_READ ERROR: Error parsing file
>>
>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>
>> Line: 1
>> DATA_WRITE ERROR: null
>>
>> Could not write string:
>> index: 0, length: 342 (expected: range(0, 256))
>>
>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>
>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>> Fragment 0:0
>>
>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>         at
>> org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:536)
>>         at
>> org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:608)
>>         at
>> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1288)
>>         at
>> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:61)
>>         at
>> org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
>>         at
>> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1107)
>>         at
>> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1118)
>>         at
>> org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
>>         at
>> org.apache.drill.jdbc.impl.DrillConnectionImpl.prepareAndExecuteInternal(DrillConnectionImpl.java:200)
>>         at
>> org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
>>         at
>> org.apache.calcite.avatica.AvaticaStatement.execute(AvaticaStatement.java:217)
>>         at sqlline.Commands.execute(Commands.java:938)
>>         at sqlline.Commands.sql(Commands.java:882)
>>         at sqlline.SqlLine.dispatch(SqlLine.java:725)
>>         at sqlline.SqlLine.begin(SqlLine.java:540)
>>         at sqlline.SqlLine.start(SqlLine.java:264)
>>         at sqlline.SqlLine.main(SqlLine.java:195)
>> Caused by: org.apache.drill.common.exceptions.UserRemoteException:
>> DATA_READ ERROR: Error parsing file
>>
>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>
>> Line: 1
>> DATA_WRITE ERROR: null
>>
>> Could not write string:
>> index: 0, length: 342 (expected: range(0, 256))
>>
>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>
>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>> Fragment 0:0
>>
>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>         at
>> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123)
>>         at
>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:422)
>>         at
>> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:96)
>>         at
>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:273)
>>         at
>> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:243)
>>         at
>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>         at
>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>         at
>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>         at
>> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
>>         at
>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>         at
>> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>>         at
>> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>>         at
>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>>         at
>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
>>         at
>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>>         at
>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
>>         at
>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
>>         at
>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
>>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
>>         at
>> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
>>         at java.lang.Thread.run(Thread.java:745)
>> Caused by: java.lang.Exception: DATA_READ ERROR: Error parsing file
>>
>> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>>
>> Line: 1
>> DATA_WRITE ERROR: null
>>
>> Could not write string:
>> index: 0, length: 342 (expected: range(0, 256))
>>
>> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>>
>> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
>> Fragment 0:0
>>
>> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>>         at
>> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:633)
>>         at
>> org.apache.drill.exec.store.syslog.SyslogRecordReader.next(SyslogRecordReader.java:169)
>>         at
>> org.apache.drill.exec.physical.impl.ScanBatch.internalNext(ScanBatch.java:223)
>>         at
>> org.apache.drill.exec.physical.impl.ScanBatch.next(ScanBatch.java:271)
>>         at
>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:126)
>>         at
>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:116)
>>         at
>> org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext(AbstractUnaryRecordBatch.java:63)
>>         at
>> org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:143)
>>         at
>> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:186)
>>         at
>> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:104)
>>         at
>> org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:83)
>>         at
>> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:94)
>>         at org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:297)
>>         at org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:284)
>>         at .......(:0)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>>         at org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:284)
>>         at
>> org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
>>         at .......(:0)
>>
>>
>> If you run a query w/o long fields it works fine:
>> jdbc:drill:zk=local> select * from dfs.test.`logs.syslog1`;
>>
>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>> |        event_date        | severity_code  | facility_code  | severity
>> | facility  |           ip           | app_name  | message_id  |
>>         message                     | process_id  |
>>        structured_data_text                               |
>>              structured_data                              |
>>
>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>> root' failed for lonvick on /dev/pts/8  | null        | null
>>                                                              | {}
>>                                                              |
>> | 1985-04-12 23:20:50.52   | 2              | 4              | CRIT
>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>> root' failed for lonvick on /dev/pts/8  | null        | null
>>                                                              | {}
>>                                                              |
>> | 1985-04-12 23:20:50.52   | 2              | 4              | CRIT
>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>> root' failed for lonvick on /dev/pts/8  | null        | null
>>                                                              | {}
>>                                                              |
>> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT
>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>> root' failed for lonvick on /dev/pts/8  | null        | null
>>                                                              | {}
>>                                                              |
>> | 2003-08-24 12:14:15.0    | 2              | 4              | CRIT
>> | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
>> root' failed for lonvick on /dev/pts/8  | null        | null
>>                                                              | {}
>>                                                              |
>> | 2003-08-24 12:14:15.0    | 5              | 20             | NOTICE
>> | LOCAL4    | 192.0.2.1              | myproc    | null        | %% It's
>> time to make the do-nuts.              | 8710        | null
>>                                                              | {}
>>                                                              |
>> | 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE
>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | null
>>                                        | null        |
>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
>> eventSource=Application, eventID=1011]} |
>> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>> | 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE
>> | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | - and
>> thats a wrap!                            | null        |
>> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
>> eventSource=Application, eventID=1011]} |
>> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>>
>> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
>> 8 rows selected (1.702 seconds)
>>
>>
>> > On Jan 24, 2019, at 17:02, Karthikeyan Manivannan <km...@mapr.com>
>> wrote:
>> >
>> > Hi Charles,
>> >
>> > Can you please provide the stack trace.
>> >
>> > Thanks
>> >
>> > Karthik
>> >
>> > On Tue, Jan 22, 2019 at 9:18 PM Charles Givre <cg...@gmail.com> wrote:
>> >
>> >> Hello all,
>> >> I’m working on a format plugin to read syslog data, and have
>> encountered
>> >> what seems to be a bit of a regression (maybe).   The code below is a
>> >> helper function which writes strings from the data.  As of Drill 1.16,
>> the
>> >> varchar holder seems to throw an error if the string you are trying to
>> >> write is > 256 characters.  Is there a workaround?
>> >>
>> >> Thanks!
>> >>
>> >>
>> >> //Helper function to map strings
>> >> private void mapStringField(String name, String value,
>> >> BaseWriter.MapWriter map) {
>> >>  if (value == null) {
>> >>    return;
>> >>  }
>> >>  try {
>> >>    byte[] bytes = value.getBytes("UTF-8");
>> >>    int stringLength = bytes.length;
>> >>    this.buffer.setBytes(0, bytes, 0, stringLength);
>> >>    map.varChar(name).writeVarChar(0, stringLength, buffer);
>> >>  } catch (Exception e) {
>> >>    throw UserException
>> >>            .dataWriteError()
>> >>            .addContext("Could not write string: ")
>> >>            .addContext(e.getMessage())
>> >>            .build(logger);
>> >>  }
>> >> }
>> >>
>> >>
>> >>
>> >>
>> >>
>>
>>

Re: Regression? Drill Truncating Varchars

Posted by Karthikeyan Manivannan <km...@mapr.com>.
Charles,

Does this work on 1.15?

Drill 1.16 reads CSV files containing strings longer than 256 characters
correctly, so I suspect the problem is in the syslog plugin code.
Can you share your format plugin code?

Thanks.

Karthik
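
P.S. For anyone hitting the same 256-byte limit: the workaround is to grow
the buffer before writing to it. In the plugin this means calling
buffer.reallocIfNeeded(stringLength), which returns the same DrillBuf if it
already has room and otherwise allocates one sized to the next power of two.
Below is a minimal, self-contained sketch of that growth pattern, using
java.nio.ByteBuffer as a stand-in for DrillBuf (the class and its method are
illustrative, not Drill APIs):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class ReallocSketch {

  // Stand-in for DrillBuf.reallocIfNeeded(): return the buffer unchanged
  // if it already has room for `needed` bytes, otherwise allocate a new
  // buffer sized to the next power of two at or above `needed`.
  static ByteBuffer reallocIfNeeded(ByteBuffer buf, int needed) {
    if (buf.capacity() >= needed) {
      return buf;
    }
    int cap = Integer.highestOneBit(needed);
    if (cap < needed) {
      cap <<= 1;  // round up to the next power of two
    }
    return ByteBuffer.allocate(cap);
  }

  public static void main(String[] args) {
    // A default 256-byte buffer and a 342-byte value, as in the error above.
    ByteBuffer buffer = ByteBuffer.allocate(256);
    byte[] bytes = "x".repeat(342).getBytes(StandardCharsets.UTF_8);

    buffer = reallocIfNeeded(buffer, bytes.length);  // grows to 512 bytes
    buffer.put(bytes, 0, bytes.length);              // no longer overruns
    System.out.println(buffer.capacity());           // prints 512
  }
}
```

In the mapStringField() helper from the original message, this amounts to
adding `buffer = buffer.reallocIfNeeded(stringLength);` before the
`setBytes` call.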


On Thu, Jan 24, 2019 at 4:24 PM Charles Givre <cg...@gmail.com> wrote:

> Here you go… Thanks for your help!
>
>
> SELECT
> event_date,severity_code,facility_code,severity,facility,ip,app_name,process_id,message_id,structured_data_text,structured_data_UserAgent,structured_data_UserHostAddress,structured_data_BrowserSession,structured_data_Realm,structured_data_Appliance,structured_data_Company,structured_data_UserID,structured_data_PEN,structured_data_HostName,structured_data_Category,structured_data_Priority,message
> FROM cp.`syslog/test.syslog1`
> 18:16:50.061 [main] ERROR org.apache.drill.TestReporter - Test Failed (d:
> 0 B(1 B), h: 21.6 MiB(130.9 MiB), nh: 2.0 MiB(82.2 MiB)):
> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
> java.lang.NullPointerException: null
>         at
> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>         at
> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
> ~[test-classes/:na]
>         at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
> 18:16:50.571 [main] ERROR org.apache.drill.TestReporter - Test Failed (d:
> 0 B(1 B), h: 20.5 MiB(209.9 MiB), nh: 411.1 KiB(85.4 MiB)):
> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
> java.lang.NullPointerException: null
>         at
> org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276)
> ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
>         at
> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
> ~[test-classes/:na]
>         at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0, Time elapsed:
> 3.173 s <<< FAILURE! - in
> org.apache.drill.exec.store.syslog.TestSyslogFormat
> [ERROR]
> testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
> Time elapsed: 0.234 s  <<< ERROR!
> java.lang.NullPointerException
>         at
> org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)
>
> [ERROR]
> testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
> Time elapsed: 0.125 s  <<< ERROR!
> java.lang.NullPointerException
>         at
> org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)
>
> [INFO]
> [INFO] Results:
> [INFO]
> [ERROR] Errors:
> [ERROR]   TestSyslogFormat.testExplicitFlattenedStructuredDataQuery:310 »
> NullPointer
> [ERROR]   TestSyslogFormat.testStarFlattenedStructuredDataQuery:248 »
> NullPointer
> [INFO]
> [ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0
> [INFO]
> [INFO]
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Total time:  33.919 s
> [INFO] Finished at: 2019-01-24T18:16:51-05:00
> [INFO]
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test (default-test)
> on project drill-format-syslog: There are test failures.
> [ERROR]
> [ERROR] Please refer to
> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
> for the individual test results.
> [ERROR] Please refer to dump files (if any exist) [date].dump,
> [date]-jvmRun[N].dump and [date].dumpstream.
> [ERROR] -> [Help 1]
> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute
> goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test
> (default-test) on project drill-format-syslog: There are test failures.
>
> Please refer to
> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
> for the individual test results.
> Please refer to dump files (if any exist) [date].dump,
> [date]-jvmRun[N].dump and [date].dumpstream.
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> (MojoExecutor.java:215)
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> (MojoExecutor.java:156)
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> (MojoExecutor.java:148)
>     at
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
> (LifecycleModuleBuilder.java:117)
>     at
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
> (LifecycleModuleBuilder.java:81)
>     at
> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
> (SingleThreadedBuilder.java:56)
>     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
> (LifecycleStarter.java:128)
>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>     at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>     at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>     at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>     at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:497)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
> (Launcher.java:289)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.launch
> (Launcher.java:229)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
> (Launcher.java:415)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.main
> (Launcher.java:356)
> Caused by: org.apache.maven.plugin.MojoFailureException: There are test
> failures.
>
> Please refer to
> /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports
> for the individual test results.
> Please refer to dump files (if any exist) [date].dump,
> [date]-jvmRun[N].dump and [date].dumpstream.
>     at org.apache.maven.plugin.surefire.SurefireHelper.throwException
> (SurefireHelper.java:271)
>     at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution
> (SurefireHelper.java:159)
>     at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary
> (SurefirePlugin.java:373)
>     at
> org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked
> (AbstractSurefireMojo.java:1016)
>     at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute
> (AbstractSurefireMojo.java:846)
>     at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo
> (DefaultBuildPluginManager.java:137)
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> (MojoExecutor.java:210)
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> (MojoExecutor.java:156)
>     at org.apache.maven.lifecycle.internal.MojoExecutor.execute
> (MojoExecutor.java:148)
>     at
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
> (LifecycleModuleBuilder.java:117)
>     at
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject
> (LifecycleModuleBuilder.java:81)
>     at
> org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build
> (SingleThreadedBuilder.java:56)
>     at org.apache.maven.lifecycle.internal.LifecycleStarter.execute
> (LifecycleStarter.java:128)
>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
>     at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
>     at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
>     at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
>     at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
>     at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke
> (NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke
> (DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke (Method.java:497)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced
> (Launcher.java:289)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.launch
> (Launcher.java:229)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode
> (Launcher.java:415)
>     at org.codehaus.plexus.classworlds.launcher.Launcher.main
> (Launcher.java:356)
> [ERROR]
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
>
>
> If you run the query in sqline, here is the result:
>
>
> jdbc:drill:zk=local> select * from dfs.test.`test.syslog1`;
> Error: DATA_READ ERROR: Error parsing file
>
> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>
> Line: 1
> DATA_WRITE ERROR: null
>
> Could not write string:
> index: 0, length: 342 (expected: range(0, 256))
>
> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>
> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
> Fragment 0:0
>
> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
> (state=,code=0)
> java.sql.SQLException: DATA_READ ERROR: Error parsing file
>
> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>
> Line: 1
> DATA_WRITE ERROR: null
>
> Could not write string:
> index: 0, length: 342 (expected: range(0, 256))
>
> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>
> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
> Fragment 0:0
>
> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>         at
> org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:536)
>         at
> org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:608)
>         at
> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1288)
>         at
> org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:61)
>         at
> org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
>         at
> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1107)
>         at
> org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1118)
>         at
> org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
>         at
> org.apache.drill.jdbc.impl.DrillConnectionImpl.prepareAndExecuteInternal(DrillConnectionImpl.java:200)
>         at
> org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
>         at
> org.apache.calcite.avatica.AvaticaStatement.execute(AvaticaStatement.java:217)
>         at sqlline.Commands.execute(Commands.java:938)
>         at sqlline.Commands.sql(Commands.java:882)
>         at sqlline.SqlLine.dispatch(SqlLine.java:725)
>         at sqlline.SqlLine.begin(SqlLine.java:540)
>         at sqlline.SqlLine.start(SqlLine.java:264)
>         at sqlline.SqlLine.main(SqlLine.java:195)
> Caused by: org.apache.drill.common.exceptions.UserRemoteException:
> DATA_READ ERROR: Error parsing file
>
> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>
> Line: 1
> DATA_WRITE ERROR: null
>
> Could not write string:
> index: 0, length: 342 (expected: range(0, 256))
>
> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>
> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
> Fragment 0:0
>
> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>         at
> org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123)
>         at
> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:422)
>         at
> org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:96)
>         at
> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:273)
>         at
> org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:243)
>         at
> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>         at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>         at
> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>         at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>         at
> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>         at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>         at
> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
>         at
> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>         at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>         at
> io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>         at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
>         at
> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
>         at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
>         at
> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
>         at
> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>         at
> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
>         at
> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
>         at
> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
>         at
> io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.Exception: DATA_READ ERROR: Error parsing file
>
> DATA_READ ERROR: Maximum Error Threshold Exceeded:
>
> Line: 1
> DATA_WRITE ERROR: null
>
> Could not write string:
> index: 0, length: 342 (expected: range(0, 256))
>
> [Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]
>
> [Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
> Fragment 0:0
>
> [Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
>         at
> org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:633)
>         at
> org.apache.drill.exec.store.syslog.SyslogRecordReader.next(SyslogRecordReader.java:169)
>         at
> org.apache.drill.exec.physical.impl.ScanBatch.internalNext(ScanBatch.java:223)
>         at
> org.apache.drill.exec.physical.impl.ScanBatch.next(ScanBatch.java:271)
>         at
> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:126)
>         at
> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:116)
>         at
> org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext(AbstractUnaryRecordBatch.java:63)
>         at
> org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:143)
>         at
> org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:186)
>         at
> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:104)
>         at
> org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:83)
>         at
> org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:94)
>         at org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:297)
>         at org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:284)
>         at .......(:0)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
>         at org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:284)
>         at
> org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
>         at .......(:0)
>
>
> If you run a query w/o long fields it works fine:
> jdbc:drill:zk=local> select * from dfs.test.`logs.syslog1`;
>
> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
> |        event_date        | severity_code  | facility_code  | severity  |
> facility  |           ip           | app_name  | message_id  |
>       message                     | process_id  |
>      structured_data_text                               |
>            structured_data                              |
>
> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT      |
> AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
> root' failed for lonvick on /dev/pts/8  | null        | null
>                                                              | {}
>                                                              |
> | 1985-04-12 23:20:50.52   | 2              | 4              | CRIT      |
> AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
> root' failed for lonvick on /dev/pts/8  | null        | null
>                                                              | {}
>                                                              |
> | 1985-04-12 23:20:50.52   | 2              | 4              | CRIT      |
> AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
> root' failed for lonvick on /dev/pts/8  | null        | null
>                                                              | {}
>                                                              |
> | 2003-10-11 22:14:15.003  | 2              | 4              | CRIT      |
> AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
> root' failed for lonvick on /dev/pts/8  | null        | null
>                                                              | {}
>                                                              |
> | 2003-08-24 12:14:15.0    | 2              | 4              | CRIT      |
> AUTH      | mymachine.example.com  | su        | ID47        | BOM'su
> root' failed for lonvick on /dev/pts/8  | null        | null
>                                                              | {}
>                                                              |
> | 2003-08-24 12:14:15.0    | 5              | 20             | NOTICE    |
> LOCAL4    | 192.0.2.1              | myproc    | null        | %% It's time
> to make the do-nuts.              | 8710        | null
>                                                        | {}
>                                                        |
> | 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE    |
> LOCAL4    | mymachine.example.com  | evntslog  | ID47        | null
>                                      | null        | {examplePriority@32473=[class=high],
> exampleSDID@32473=[iut=3, eventSource=Application, eventID=1011]} |
> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
> | 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE    |
> LOCAL4    | mymachine.example.com  | evntslog  | ID47        | - and
> thats a wrap!                            | null        |
> {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3,
> eventSource=Application, eventID=1011]} |
> {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
>
> +--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
> 8 rows selected (1.702 seconds)
>
>
> > On Jan 24, 2019, at 17:02, Karthikeyan Manivannan <km...@mapr.com>
> wrote:
> >
> > Hi Charles,
> >
> > Can you please provide the stack trace.
> >
> > Thanks
> >
> > Karthik
> >
> > On Tue, Jan 22, 2019 at 9:18 PM Charles Givre <cg...@gmail.com> wrote:
> >
> >> Hello all,
> >> I’m working on a format plugin to read syslog data, and have encountered
> >> what seems to be a bit of a regression (maybe).   The code below is a
> >> helper function which writes strings from the data.  As of Drill 1.16,
> the
> >> varchar holder seems to throw an error if the string you are trying to
> >> write is > 256 characters.  Is there a workaround?
> >>
> >> Thanks!
> >>
> >>
> >> //Helper function to map strings
> >> private void mapStringField(String name, String value,
> >> BaseWriter.MapWriter map) {
> >>  if (value == null) {
> >>    return;
> >>  }
> >>  try {
> >>    byte[] bytes = value.getBytes("UTF-8");
> >>    int stringLength = bytes.length;
> >>    this.buffer.setBytes(0, bytes, 0, stringLength);
> >>    map.varChar(name).writeVarChar(0, stringLength, buffer);
> >>  } catch (Exception e) {
> >>    throw UserException
> >>            .dataWriteError()
> >>            .addContext("Could not write string: ")
> >>            .addContext(e.getMessage())
> >>            .build(logger);
> >>  }
> >> }
> >>
> >>
> >>
> >>
> >>
>
>

Re: Regression? Drill Truncating Varchars

Posted by Charles Givre <cg...@gmail.com>.
Here you go… Thanks for your help!


SELECT event_date,severity_code,facility_code,severity,facility,ip,app_name,process_id,message_id,structured_data_text,structured_data_UserAgent,structured_data_UserHostAddress,structured_data_BrowserSession,structured_data_Realm,structured_data_Appliance,structured_data_Company,structured_data_UserID,structured_data_PEN,structured_data_HostName,structured_data_Category,structured_data_Priority,message FROM cp.`syslog/test.syslog1`
18:16:50.061 [main] ERROR org.apache.drill.TestReporter - Test Failed (d: 0 B(1 B), h: 21.6 MiB(130.9 MiB), nh: 2.0 MiB(82.2 MiB)): testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
java.lang.NullPointerException: null
	at org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276) ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
	at org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310) ~[test-classes/:na]
	at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
18:16:50.571 [main] ERROR org.apache.drill.TestReporter - Test Failed (d: 0 B(1 B), h: 20.5 MiB(209.9 MiB), nh: 411.1 KiB(85.4 MiB)): testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)
java.lang.NullPointerException: null
	at org.apache.drill.test.rowSet.RowSetComparison.verifyAndClearAll(RowSetComparison.java:276) ~[drill-java-exec-1.16.0-SNAPSHOT-tests.jar:1.16.0-SNAPSHOT]
	at org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248) ~[test-classes/:na]
	at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_65]
[ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 3.173 s <<< FAILURE! - in org.apache.drill.exec.store.syslog.TestSyslogFormat
[ERROR] testExplicitFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)  Time elapsed: 0.234 s  <<< ERROR!
java.lang.NullPointerException
	at org.apache.drill.exec.store.syslog.TestSyslogFormat.testExplicitFlattenedStructuredDataQuery(TestSyslogFormat.java:310)

[ERROR] testStarFlattenedStructuredDataQuery(org.apache.drill.exec.store.syslog.TestSyslogFormat)  Time elapsed: 0.125 s  <<< ERROR!
java.lang.NullPointerException
	at org.apache.drill.exec.store.syslog.TestSyslogFormat.testStarFlattenedStructuredDataQuery(TestSyslogFormat.java:248)

[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR]   TestSyslogFormat.testExplicitFlattenedStructuredDataQuery:310 » NullPointer
[ERROR]   TestSyslogFormat.testStarFlattenedStructuredDataQuery:248 » NullPointer
[INFO]
[ERROR] Tests run: 6, Failures: 0, Errors: 2, Skipped: 0
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  33.919 s
[INFO] Finished at: 2019-01-24T18:16:51-05:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test (default-test) on project drill-format-syslog: There are test failures.
[ERROR]
[ERROR] Please refer to /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M2:test (default-test) on project drill-format-syslog: There are test failures.

Please refer to /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports for the individual test results.
Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:215)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:497)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures.

Please refer to /Users/cgivre/github/drill-dev/drill/contrib/format-syslog/target/surefire-reports for the individual test results.
Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
    at org.apache.maven.plugin.surefire.SurefireHelper.throwException (SurefireHelper.java:271)
    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution (SurefireHelper.java:159)
    at org.apache.maven.plugin.surefire.SurefirePlugin.handleSummary (SurefirePlugin.java:373)
    at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked (AbstractSurefireMojo.java:1016)
    at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute (AbstractSurefireMojo.java:846)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:497)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException


If you run the query in sqlline, here is the result:


jdbc:drill:zk=local> select * from dfs.test.`test.syslog1`;
Error: DATA_READ ERROR: Error parsing file

DATA_READ ERROR: Maximum Error Threshold Exceeded:

Line: 1
DATA_WRITE ERROR: null

Could not write string:
index: 0, length: 342 (expected: range(0, 256))

[Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]

[Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
Fragment 0:0

[Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010] (state=,code=0)
java.sql.SQLException: DATA_READ ERROR: Error parsing file

DATA_READ ERROR: Maximum Error Threshold Exceeded:

Line: 1
DATA_WRITE ERROR: null

Could not write string:
index: 0, length: 342 (expected: range(0, 256))

[Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]

[Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
Fragment 0:0

[Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
	at org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:536)
	at org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:608)
	at org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1288)
	at org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:61)
	at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
	at org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1107)
	at org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1118)
	at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
	at org.apache.drill.jdbc.impl.DrillConnectionImpl.prepareAndExecuteInternal(DrillConnectionImpl.java:200)
	at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
	at org.apache.calcite.avatica.AvaticaStatement.execute(AvaticaStatement.java:217)
	at sqlline.Commands.execute(Commands.java:938)
	at sqlline.Commands.sql(Commands.java:882)
	at sqlline.SqlLine.dispatch(SqlLine.java:725)
	at sqlline.SqlLine.begin(SqlLine.java:540)
	at sqlline.SqlLine.start(SqlLine.java:264)
	at sqlline.SqlLine.main(SqlLine.java:195)
Caused by: org.apache.drill.common.exceptions.UserRemoteException: DATA_READ ERROR: Error parsing file

DATA_READ ERROR: Maximum Error Threshold Exceeded:

Line: 1
DATA_WRITE ERROR: null

Could not write string:
index: 0, length: 342 (expected: range(0, 256))

[Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]

[Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
Fragment 0:0

[Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
	at org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123)
	at org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:422)
	at org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:96)
	at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:273)
	at org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:243)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
	at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.Exception: DATA_READ ERROR: Error parsing file

DATA_READ ERROR: Maximum Error Threshold Exceeded:

Line: 1
DATA_WRITE ERROR: null

Could not write string:
index: 0, length: 342 (expected: range(0, 256))

[Error Id: 39ac3038-63b7-4190-bcc7-e2bdda924021 ]

[Error Id: 84807daf-ae51-4a2b-ba2a-1729facdb891 ]
Fragment 0:0

[Error Id: 6054aa6b-91df-43b5-8157-a45b4e316e7f on 192.168.1.21:31010]
	at org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:633)
	at org.apache.drill.exec.store.syslog.SyslogRecordReader.next(SyslogRecordReader.java:169)
	at org.apache.drill.exec.physical.impl.ScanBatch.internalNext(ScanBatch.java:223)
	at org.apache.drill.exec.physical.impl.ScanBatch.next(ScanBatch.java:271)
	at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:126)
	at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:116)
	at org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext(AbstractUnaryRecordBatch.java:63)
	at org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext(ProjectRecordBatch.java:143)
	at org.apache.drill.exec.record.AbstractRecordBatch.next(AbstractRecordBatch.java:186)
	at org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:104)
	at org.apache.drill.exec.physical.impl.ScreenCreator$ScreenRoot.innerNext(ScreenCreator.java:83)
	at org.apache.drill.exec.physical.impl.BaseRootExec.next(BaseRootExec.java:94)
	at org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:297)
	at org.apache.drill.exec.work.fragment.FragmentExecutor$1.run(FragmentExecutor.java:284)
	at .......(:0)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
	at org.apache.drill.exec.work.fragment.FragmentExecutor.run(FragmentExecutor.java:284)
	at org.apache.drill.common.SelfCleaningRunnable.run(SelfCleaningRunnable.java:38)
	at .......(:0)


If you run a query without long fields, it works fine:
jdbc:drill:zk=local> select * from dfs.test.`logs.syslog1`;
+--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
|        event_date        | severity_code  | facility_code  | severity  | facility  |           ip           | app_name  | message_id  |                    message                     | process_id  |                               structured_data_text                               |                             structured_data                              |
+--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
| 2003-10-11 22:14:15.003  | 2              | 4              | CRIT      | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su root' failed for lonvick on /dev/pts/8  | null        | null                                                                             | {}                                                                       |
| 1985-04-12 23:20:50.52   | 2              | 4              | CRIT      | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su root' failed for lonvick on /dev/pts/8  | null        | null                                                                             | {}                                                                       |
| 1985-04-12 23:20:50.52   | 2              | 4              | CRIT      | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su root' failed for lonvick on /dev/pts/8  | null        | null                                                                             | {}                                                                       |
| 2003-10-11 22:14:15.003  | 2              | 4              | CRIT      | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su root' failed for lonvick on /dev/pts/8  | null        | null                                                                             | {}                                                                       |
| 2003-08-24 12:14:15.0    | 2              | 4              | CRIT      | AUTH      | mymachine.example.com  | su        | ID47        | BOM'su root' failed for lonvick on /dev/pts/8  | null        | null                                                                             | {}                                                                       |
| 2003-08-24 12:14:15.0    | 5              | 20             | NOTICE    | LOCAL4    | 192.0.2.1              | myproc    | null        | %% It's time to make the do-nuts.              | 8710        | null                                                                             | {}                                                                       |
| 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE    | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | null                                           | null        | {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3, eventSource=Application, eventID=1011]} | {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
| 2003-10-11 22:14:15.003  | 5              | 20             | NOTICE    | LOCAL4    | mymachine.example.com  | evntslog  | ID47        | - and thats a wrap!                            | null        | {examplePriority@32473=[class=high], exampleSDID@32473=[iut=3, eventSource=Application, eventID=1011]} | {"class":"high","iut":"3","eventSource":"Application","eventID":"1011"}  |
+--------------------------+----------------+----------------+-----------+-----------+------------------------+-----------+-------------+------------------------------------------------+-------------+----------------------------------------------------------------------------------+--------------------------------------------------------------------------+
8 rows selected (1.702 seconds)
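
The failing case above tries to write 342 bytes into a DrillBuf that was allocated with only 256; growing the buffer with reallocIfNeeded before setBytes avoids the overflow. The standalone sketch below mimics that grow-to-the-next-power-of-two pattern in plain Java — GrowableBuffer and its method names are illustrative stand-ins, not Drill's actual DrillBuf API:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Standalone sketch of the reallocIfNeeded pattern. GrowableBuffer is a
// stand-in for DrillBuf: it starts at 256 bytes (the capacity the 342-byte
// message overflowed) and grows to the next power of two on demand.
public class VarcharBufferSketch {

  static class GrowableBuffer {
    ByteBuffer buf = ByteBuffer.allocate(256);

    // Keep the buffer if it is already large enough, otherwise replace it
    // with one sized to the next power of two above the requested size.
    GrowableBuffer reallocIfNeeded(int size) {
      if (buf.capacity() < size) {
        buf = ByteBuffer.allocate(Integer.highestOneBit(size - 1) << 1);
      }
      return this;
    }

    void setBytes(int index, byte[] src, int offset, int length) {
      buf.position(index);
      buf.put(src, offset, length);
    }

    int capacity() {
      return buf.capacity();
    }
  }

  // Shape of the fixed helper: realloc before setBytes, instead of writing
  // into the original fixed-size buffer. Returns the resulting capacity.
  static int writeString(GrowableBuffer buffer, String value) {
    byte[] bytes = value.getBytes(StandardCharsets.UTF_8);
    buffer.reallocIfNeeded(bytes.length).setBytes(0, bytes, 0, bytes.length);
    return buffer.capacity();
  }

  public static void main(String[] args) {
    GrowableBuffer buffer = new GrowableBuffer();
    String longMsg = "x".repeat(342);                  // the length from the error
    System.out.println(writeString(buffer, "short"));  // 256: fits, no growth
    System.out.println(writeString(buffer, longMsg));  // 512: next power of 2
  }
}
```

With this pattern in the record reader, the 342-byte structured-data string is copied into a 512-byte buffer rather than overflowing the 256-byte one.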


> On Jan 24, 2019, at 17:02, Karthikeyan Manivannan <km...@mapr.com> wrote:
> 
> Hi Charles,
> 
> Can you please provide the stack trace.
> 
> Thanks
> 
> Karthik
> 
> On Tue, Jan 22, 2019 at 9:18 PM Charles Givre <cg...@gmail.com> wrote:
> 
>> Hello all,
>> I’m working on a format plugin to read syslog data, and have encountered
>> what seems to be a bit of a regression (maybe).   The code below is a
>> helper function which writes strings from the data.  As of Drill 1.16, the
>> varchar holder seems to throw an error if the string you are trying to
>> write is > 256 characters.  Is there a workaround?
>> 
>> Thanks!
>> 
>> 
>> //Helper function to map strings
>> private void mapStringField(String name, String value,
>> BaseWriter.MapWriter map) {
>>  if (value == null) {
>>    return;
>>  }
>>  try {
>>    byte[] bytes = value.getBytes("UTF-8");
>>    int stringLength = bytes.length;
>>    this.buffer.setBytes(0, bytes, 0, stringLength);
>>    map.varChar(name).writeVarChar(0, stringLength, buffer);
>>  } catch (Exception e) {
>>    throw UserException
>>            .dataWriteError()
>>            .addContext("Could not write string: ")
>>            .addContext(e.getMessage())
>>            .build(logger);
>>  }
>> }
>> 
>> 
>> 
>> 
>> 


Re: Regression? Drill Truncating Varchars

Posted by Karthikeyan Manivannan <km...@mapr.com>.
Hi Charles,

Can you please provide the stack trace.

Thanks

Karthik

On Tue, Jan 22, 2019 at 9:18 PM Charles Givre <cg...@gmail.com> wrote:

> Hello all,
> I’m working on a format plugin to read syslog data, and have encountered
> what seems to be a bit of a regression (maybe).   The code below is a
> helper function which writes strings from the data.  As of Drill 1.16, the
> varchar holder seems to throw an error if the string you are trying to
> write is > 256 characters.  Is there a workaround?
>
> Thanks!
>
>
> //Helper function to map strings
> private void mapStringField(String name, String value,
> BaseWriter.MapWriter map) {
>   if (value == null) {
>     return;
>   }
>   try {
>     byte[] bytes = value.getBytes("UTF-8");
>     int stringLength = bytes.length;
>     this.buffer.setBytes(0, bytes, 0, stringLength);
>     map.varChar(name).writeVarChar(0, stringLength, buffer);
>   } catch (Exception e) {
>     throw UserException
>             .dataWriteError()
>             .addContext("Could not write string: ")
>             .addContext(e.getMessage())
>             .build(logger);
>   }
> }
>
>
>
>
>