Posted to issues@nifi.apache.org by "sumanth chinthagunta (JIRA)" <ji...@apache.org> on 2016/07/18 03:44:20 UTC

[jira] [Updated] (NIFI-2298) Add missing features for ConsumeKafka

     [ https://issues.apache.org/jira/browse/NIFI-2298?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

sumanth chinthagunta updated NIFI-2298:
---------------------------------------
    Description: 
The new ConsumeKafka processor is missing some capabilities that were present in the old GetKafka processor.
1. The new ConsumeKafka does not write critical Kafka attributes (e.g., kafka.key, kafka.offset, kafka.partition) into FlowFile attributes.

Old GetKafka processor:
{quote}
Standard FlowFile Attributes
Key: 'entryDate'
               Value: 'Sun Jul 17 15:17:00 CDT 2016'
Key: 'lineageStartDate'
               Value: 'Sun Jul 17 15:17:00 CDT 2016'
Key: 'fileSize'
               Value: '183'
FlowFile Attribute Map Content
Key: 'filename'
               Value: '19709945781167274'
Key: 'kafka.key'
               Value: '\{"database":"test","table":"sc_job","pk.systemid":1\}'
Key: 'kafka.offset'
               Value: '1184010261'
Key: 'kafka.partition'
               Value: '0'
Key: 'kafka.topic'
               Value: 'data'
Key: 'path'
               Value: './'
Key: 'uuid'
               Value: '244059bb-9ad9-4d74-b1fb-312eee72124a'
 {quote}
 
New ConsumeKafka processor:
{quote}
Standard FlowFile Attributes
Key: 'entryDate'
               Value: 'Sun Jul 17 15:18:41 CDT 2016'
Key: 'lineageStartDate'
               Value: 'Sun Jul 17 15:18:41 CDT 2016'
Key: 'fileSize'
               Value: '183'
FlowFile Attribute Map Content
Key: 'filename'
               Value: '19710046870478139'
Key: 'path'
               Value: './'
Key: 'uuid'
               Value: '349fbeb3-e342-4533-be4c-424793fa5c59'
{quote}
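For illustration, the per-message attributes shown in the GetKafka output above could be assembled as a plain map before being put on the FlowFile (e.g., via session.putAllAttributes). This is a minimal sketch; the class and method names are hypothetical, not actual NiFi code:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical helper sketching the attribute map that the old GetKafka
// processor emitted per message, which ConsumeKafka could populate the same
// way from each consumed Kafka record.
public class KafkaAttributeSketch {
    public static Map<String, String> buildAttributes(String topic, int partition,
                                                      long offset, String key) {
        Map<String, String> attrs = new HashMap<>();
        attrs.put("kafka.topic", topic);                          // source topic
        attrs.put("kafka.partition", String.valueOf(partition));  // partition the record came from
        attrs.put("kafka.offset", String.valueOf(offset));        // offset within the partition
        if (key != null) {
            attrs.put("kafka.key", key);                          // message key, when present
        }
        return attrs;
    }
}
```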

2. GetKafka/PutKafka are compatible with Kafka 0.8.x and 0.9.x.
Please base the new PublishKafka/ConsumeKafka processors on Kafka 0.10.

3. Support subscribing to multiple topics, e.g., topic: topic1,topic2
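The multi-topic request above amounts to splitting a comma-separated Topic property value into the topic list that the Kafka 0.10 KafkaConsumer.subscribe(Collection) API accepts. A minimal sketch (the class name and trimming behavior are assumptions, not NiFi code):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical helper: parse "topic1,topic2" into a list suitable for
// passing to KafkaConsumer.subscribe(List<String>).
public class TopicListSketch {
    public static List<String> parseTopics(String property) {
        return Arrays.stream(property.split(","))
                .map(String::trim)                 // tolerate "topic1, topic2"
                .filter(s -> !s.isEmpty())         // drop empty entries from trailing commas
                .collect(Collectors.toList());
    }
}
```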



> Add missing features for ConsumeKafka
> -------------------------------------
>
>                 Key: NIFI-2298
>                 URL: https://issues.apache.org/jira/browse/NIFI-2298
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Extensions
>    Affects Versions: 0.7.0
>            Reporter: sumanth chinthagunta
>              Labels: kafka
>             Fix For: 0.8.0



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)