Posted to issues@nifi.apache.org by "Mark Payne (JIRA)" <ji...@apache.org> on 2017/12/08 14:17:00 UTC

[jira] [Updated] (NIFI-4639) PublishKafkaRecord with Avro writer: schema lost from output

     [ https://issues.apache.org/jira/browse/NIFI-4639?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Mark Payne updated NIFI-4639:
-----------------------------
    Status: Patch Available  (was: Open)

> PublishKafkaRecord with Avro writer: schema lost from output
> ------------------------------------------------------------
>
>                 Key: NIFI-4639
>                 URL: https://issues.apache.org/jira/browse/NIFI-4639
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Extensions
>    Affects Versions: 1.4.0
>            Reporter: Matthew Silverman
>         Attachments: Demo_Names_NiFi_bug.xml
>
>
> I have a {{PublishKafkaRecord_0_10}} configured with an {{AvroRecordSetWriter}}, which in turn is configured to "Embed Avro Schema".  However, when I consume the data from Kafka, I receive individual records that lack the Avro schema header.
> As a workaround, I can send the flow files through a {{SplitRecord}} processor, which does embed the Avro schema into each resulting flow file.
> Comparing the code for the {{SplitRecord}} and {{PublishKafkaRecord}} processors, I believe the issue is that {{PublisherLease}} wipes the output stream after calling {{createWriter}}; however, it is {{AvroRecordSetWriter#createWriter}} that writes the Avro header to the output stream.  {{SplitRecord}}, on the other hand, creates a new writer for each output record.
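> To illustrate, here is a minimal sketch of the suspected pattern (simplified, hypothetical names; not the actual NiFi source):
> {code:java}
> import java.io.ByteArrayOutputStream;
> import java.io.IOException;
>
> // Stand-in for AvroRecordSetWriter: emits the schema header when created.
> class AvroLikeWriter {
>     private final ByteArrayOutputStream out;
>
>     AvroLikeWriter(ByteArrayOutputStream out) throws IOException {
>         this.out = out;
>         out.write("SCHEMA_HEADER:".getBytes()); // header written at creation time
>     }
>
>     void writeRecord(String record) throws IOException {
>         out.write(record.getBytes());
>     }
> }
>
> public class PublisherLeaseSketch {
>     public static void main(String[] args) throws IOException {
>         ByteArrayOutputStream out = new ByteArrayOutputStream();
>         AvroLikeWriter writer = new AvroLikeWriter(out); // header goes into the stream here
>         out.reset(); // PublisherLease-style reset after createWriter: header is discarded
>         writer.writeRecord("record-1");
>         System.out.println(out); // prints "record-1" -- no schema header
>     }
> }
> {code}
> Creating a new writer per record, as {{SplitRecord}} does, avoids the problem because each writer re-emits the header into a fresh stream.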
> I've attached my flow.


