Posted to issues@nifi.apache.org by "Matt Burgess (JIRA)" <ji...@apache.org> on 2019/06/12 20:10:00 UTC
[jira] [Updated] (NIFI-6295) NiFiRecordSerDe in PutHive3StreamingProcessor does not handle types contained in arrays
[ https://issues.apache.org/jira/browse/NIFI-6295?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Matt Burgess updated NIFI-6295:
-------------------------------
Affects Version/s: (was: 1.9.2)
Status: Patch Available (was: Open)
> NiFiRecordSerDe in PutHive3StreamingProcessor does not handle types contained in arrays
> ---------------------------------------------------------------------------------------
>
> Key: NIFI-6295
> URL: https://issues.apache.org/jira/browse/NIFI-6295
> Project: Apache NiFi
> Issue Type: Bug
> Components: Extensions
> Environment: - Rhel 7.5
> - JDK 8
> Reporter: Gideon Korir
> Priority: Major
>
> NiFiRecordSerDe does not correctly handle objects contained in arrays, causing the Hive Streaming writer to fail when it tries to flush its contents.
>
> The culprit:
> {code:java}
> // NiFiRecordSerDe.extractCurrentField
> case LIST:
>     val = Arrays.asList(record.getAsArray(fieldName));
>     break;
> {code}
> The user is left with an unhelpful error:
>
> {code:java}
java.lang.NullPointerException: null
	at java.lang.System.arraycopy(Native Method)
	at org.apache.hadoop.io.Text.set(Text.java:225)
	at org.apache.orc.impl.StringRedBlackTree.add(StringRedBlackTree.java:59)
	at org.apache.orc.impl.writer.StringTreeWriter.writeBatch(StringTreeWriter.java:70)
	at org.apache.orc.impl.writer.StructTreeWriter.writeFields(StructTreeWriter.java:64)
	at org.apache.orc.impl.writer.StructTreeWriter.writeBatch(StructTreeWriter.java:78)
	at org.apache.orc.impl.writer.StructTreeWriter.writeRootBatch(StructTreeWriter.java:56)
	at org.apache.orc.impl.WriterImpl.addRowBatch(WriterImpl.java:556)
	at org.apache.hadoop.hive.ql.io.orc.WriterImpl.flushInternalBatch(WriterImpl.java:297)
	at org.apache.hadoop.hive.ql.io.orc.WriterImpl.close(WriterImpl.java:334)
	at org.apache.hadoop.hive.ql.io.orc.OrcRecordUpdater.close(OrcRecordUpdater.java:557)
	at org.apache.hive.streaming.AbstractRecordWriter.close(AbstractRecordWriter.java:360)
	at org.apache.hive.streaming.HiveStreamingConnection$TransactionBatch.closeImpl(HiveStreamingConnection.java:979)
	at org.apache.hive.streaming.HiveStreamingConnection$TransactionBatch.close(HiveStreamingConnection.java:970)
	at org.apache.hive.streaming.HiveStreamingConnection$TransactionBatch.markDead(HiveStreamingConnection.java:833)
	at org.apache.hive.streaming.HiveStreamingConnection$TransactionBatch.write(HiveStreamingConnection.java:814)
	at org.apache.hive.streaming.HiveStreamingConnection.write(HiveStreamingConnection.java:533)
	at org.apache.nifi.processors.hive.PutHive3Streaming.onTrigger(PutHive3Streaming.java:414)
	at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
	at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1162)
	at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:205)
	at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
> {code}
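Editor's note: the snippet quoted above wraps the raw {{Object[]}} with {{Arrays.asList}} without converting the individual elements, so Hive's ORC writer later receives values (or nulls) it cannot serialize. A minimal, hypothetical sketch of the general fix direction, using plain Java with no NiFi or Hive APIs (class and method names here are illustrative, not the actual NiFi patch):

{code:java}
import java.util.ArrayList;
import java.util.List;

// Sketch only: the real SerDe would dispatch on the declared Hive element
// type and recurse for nested records/lists/maps; a scalar case is shown.
public class ListFieldSketch {

    // Convert one array element to a value the downstream writer can handle.
    static Object convertElement(Object element) {
        if (element == null) {
            return null; // keep nulls explicit rather than letting them surface later in ORC
        }
        return element.toString(); // illustrative scalar conversion only
    }

    // Instead of Arrays.asList(raw), convert each element before building the list.
    static List<Object> extractList(Object[] raw) {
        List<Object> converted = new ArrayList<>(raw.length);
        for (Object e : raw) {
            converted.add(convertElement(e));
        }
        return converted;
    }
}
{code}

The point of the sketch is only that element-wise conversion (with null handling) must happen before the list reaches the Hive Streaming writer; the actual change lives in NiFiRecordSerDe.extractCurrentField.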
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)