Posted to dev@beam.apache.org by Lukasz Cwik <lc...@google.com> on 2019/05/02 18:05:18 UTC

Re: kafka client interoperability

+dev <de...@beam.apache.org>

On Thu, May 2, 2019 at 10:34 AM Moorhead,Richard <
Richard.Moorhead2@cerner.com> wrote:

> In Beam 2.9.0, KafkaRecordCoder made this check:
>
>
> https://github.com/apache/beam/blob/2ba00576e3a708bb961a3c64a2241d9ab32ab5b3/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaRecordCoder.java#L132
>
> However, this logic was removed in 2.10+ in the newer ProducerRecordCoder
> class:
>
>
> https://github.com/apache/beam/blob/release-2.10.0/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/ProducerRecordCoder.java#L137
>
>
> We are attempting to use Beam 2.10 with Kafka 0.10.2.1; this is advertised
> as supported here:
>
> https://beam.apache.org/releases/javadoc/2.10.0/org/apache/beam/sdk/io/kafka/KafkaIO.html
>
> However, we are experiencing issues with the `headers` method call
> mentioned above. Is there a way around this?
>
>
>
>
> CONFIDENTIALITY NOTICE This message and any included attachments are from
> Cerner Corporation and are intended only for the addressee. The information
> contained in this message is confidential and may constitute inside or
> non-public information under international, federal, or state securities
> laws. Unauthorized forwarding, printing, copying, distribution, or use of
> such information is strictly prohibited and may be unlawful. If you are not
> the addressee, please promptly delete this message and notify the sender of
> the delivery error by e-mail or you may call Cerner's corporate offices in
> Kansas City, Missouri, U.S.A at (+1) (816)221-1024.
>
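For context, here is a minimal sketch of the kind of guard being discussed. This is not Beam's actual KafkaRecordCoder or ProducerRecordCoder code, and the class and method names below are illustrative: kafka-clients 0.10.2.1 predates record headers, so a coder has to probe for ProducerRecord#headers() before calling it rather than calling it unconditionally.

import java.lang.reflect.Method;

import org.apache.kafka.clients.producer.ProducerRecord;

// Illustrative sketch only. It probes the kafka-clients jar on the classpath for
// ProducerRecord#headers() and skips header handling when the method is absent,
// as it is in 0.10.2.1. Beam 2.9's KafkaRecordCoder (linked above) made a check
// of this general shape; the 2.10 ProducerRecordCoder linked above appears to
// call headers() unconditionally.
public class HeadersGuardSketch {

  // True if the ProducerRecord class on the classpath exposes headers(),
  // i.e. kafka-clients is 0.11.0.0 or newer.
  static boolean recordSupportsHeaders() {
    try {
      Method headers = ProducerRecord.class.getMethod("headers");
      return headers != null;
    } catch (NoSuchMethodException e) {
      return false; // kafka-clients 0.10.x: record headers do not exist
    }
  }

  public static void main(String[] args) {
    // This constructor exists in both 0.10.x and 0.11+ clients.
    ProducerRecord<String, String> record = new ProducerRecord<>("topic", "key", "value");
    if (recordSupportsHeaders()) {
      // Compiles against kafka-clients >= 0.11; the guard keeps it from being
      // reached at runtime when a 0.10.x jar is on the classpath.
      System.out.println("headers: " + record.headers());
    } else {
      System.out.println("Old kafka-clients on the classpath; skipping record headers");
    }
  }
}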

Re: kafka client interoperability

Posted by Alexey Romanenko <ar...@gmail.com>.
Oops, I see that Richard already created a Jira about that, so I closed mine as a duplicate.

> On 3 May 2019, at 15:58, Alexey Romanenko <ar...@gmail.com> wrote:
> 
> Thank you for reporting this. 
> 
> Seems like it’s a bug there (since ProducerRecord from kafka-clients:0.10.2.1 doesn’t support headers), so I created a Jira for that:
> https://issues.apache.org/jira/browse/BEAM-7217 <https://issues.apache.org/jira/browse/BEAM-7217>
> 
> Unfortunately, I can’t reproduce it on my machine. Could you add your pom file and an example of your pipeline to the Jira?
> 
> As a workaround, I’d suggest trying kafka-clients version >= 0.11.0.2 (if possible).
> 
> 
>> On 3 May 2019, at 14:12, Moorhead,Richard <Richard.Moorhead2@cerner.com <ma...@cerner.com>> wrote:
>> 
>> We attempted downgrading to beam-sdks-java-io-kafka 2.9 while keeping 2.10 for the rest, and ran into issues. I still see checks against ConsumerSpEL throughout ProducerRecordCoder, and I am beginning to think this is a bug.
>> 
>> From: Juan Carlos Garcia <jcgarciam@gmail.com <ma...@gmail.com>>
>> Sent: Thursday, May 2, 2019 11:10 PM
>> To: user@beam.apache.org <ma...@beam.apache.org>
>> Cc: dev
>> Subject: Re: kafka client interoperability
>>  
>> Downgrade only the KafkaIO module to the version that works for you (also excluding any transitive dependencies it pulls in); that works for us.
>> 
>> JC. 
>> 
>> Lukasz Cwik <lcwik@google.com <ma...@google.com>> wrote on Thu, 2 May 2019, 20:05:
>> +dev <ma...@beam.apache.org> 
>> 
>> On Thu, May 2, 2019 at 10:34 AM Moorhead,Richard <Richard.Moorhead2@cerner.com <ma...@cerner.com>> wrote:
>> In Beam 2.9.0, this check was made:
>> 
>> https://github.com/apache/beam/blob/2ba00576e3a708bb961a3c64a2241d9ab32ab5b3/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaRecordCoder.java#L132 <https://github.com/apache/beam/blob/2ba00576e3a708bb961a3c64a2241d9ab32ab5b3/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaRecordCoder.java#L132>
>> 
>> However this logic was removed in 2.10+ in the newer ProducerRecordCoder class:
>> 
>> https://github.com/apache/beam/blob/release-2.10.0/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/ProducerRecordCoder.java#L137 <https://github.com/apache/beam/blob/release-2.10.0/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/ProducerRecordCoder.java#L137>
>> 
>> 
>> We are attempting to use Beam 2.10 with kafka 0.10.2.1; this is advertised as supported here:
>> https://beam.apache.org/releases/javadoc/2.10.0/org/apache/beam/sdk/io/kafka/KafkaIO.html <https://beam.apache.org/releases/javadoc/2.10.0/org/apache/beam/sdk/io/kafka/KafkaIO.html>
>> 
>> However we are experiencing issues with the `headers` method call mentioned above. Is there a way around this?
>> 
>> 
> 


Re: kafka client interoperability

Posted by Alexey Romanenko <ar...@gmail.com>.
Thank you for reporting this. 

Seems like it’s a bug there (since ProducerRecord from kafka-clients:0.10.2.1 doesn’t support headers), so I created a Jira for that:
https://issues.apache.org/jira/browse/BEAM-7217 <https://issues.apache.org/jira/browse/BEAM-7217>

Unfortunately, I can’t reproduce it on my machine. Could you add your pom file and an example of your pipeline to the Jira?

As a workaround, I’d suggest trying kafka-clients version >= 0.11.0.2 (if possible).
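A hypothetical pom.xml fragment for that workaround (real Beam and Kafka coordinates, but the versions are only an example; adjust to your build):

<!-- Hypothetical pom.xml fragment: pin kafka-clients to a headers-capable version
     next to Beam 2.10's KafkaIO. A direct declaration takes precedence over a
     transitively resolved version. -->
<dependencies>
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-io-kafka</artifactId>
    <version>2.10.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.11.0.2</version>
  </dependency>
</dependencies>

Newer Java clients can generally talk to 0.10.2 brokers, but verify that against your cluster before upgrading the client jar.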


> On 3 May 2019, at 14:12, Moorhead,Richard <Ri...@cerner.com> wrote:
> 
> We attempted downgrading to beam-sdks-java-io-kafka 2.9 while keeping 2.10 for the rest, and ran into issues. I still see checks against ConsumerSpEL throughout ProducerRecordCoder, and I am beginning to think this is a bug.
> 
> From: Juan Carlos Garcia <jc...@gmail.com>
> Sent: Thursday, May 2, 2019 11:10 PM
> To: user@beam.apache.org
> Cc: dev
> Subject: Re: kafka client interoperability
>  
> Downgrade only the KafkaIO module to the version that works for you (also excluding any transitive dependencies it pulls in); that works for us.
> 
> JC. 
> 
> Lukasz Cwik <lcwik@google.com <ma...@google.com>> wrote on Thu, 2 May 2019, 20:05:
> +dev <ma...@beam.apache.org> 
> 
> On Thu, May 2, 2019 at 10:34 AM Moorhead,Richard <Richard.Moorhead2@cerner.com <ma...@cerner.com>> wrote:
> In Beam 2.9.0, this check was made:
> 
> https://github.com/apache/beam/blob/2ba00576e3a708bb961a3c64a2241d9ab32ab5b3/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaRecordCoder.java#L132 <https://github.com/apache/beam/blob/2ba00576e3a708bb961a3c64a2241d9ab32ab5b3/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaRecordCoder.java#L132>
> 
> However this logic was removed in 2.10+ in the newer ProducerRecordCoder class:
> 
> https://github.com/apache/beam/blob/release-2.10.0/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/ProducerRecordCoder.java#L137 <https://github.com/apache/beam/blob/release-2.10.0/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/ProducerRecordCoder.java#L137>
> 
> 
> We are attempting to use Beam 2.10 with kafka 0.10.2.1; this is advertised as supported here:
> https://beam.apache.org/releases/javadoc/2.10.0/org/apache/beam/sdk/io/kafka/KafkaIO.html <https://beam.apache.org/releases/javadoc/2.10.0/org/apache/beam/sdk/io/kafka/KafkaIO.html>
> 
> However we are experiencing issues with the `headers` method call mentioned above. Is there a way around this?
> 
> 


Re: kafka client interoperability

Posted by "Moorhead,Richard" <Ri...@cerner.com>.
We attempted downgrading to beam-sdks-java-io-kafka 2.9 while keeping 2.10 for the rest, and ran into issues. I still see checks against ConsumerSpEL throughout ProducerRecordCoder, and I am beginning to think this is a bug.

________________________________
From: Juan Carlos Garcia <jc...@gmail.com>
Sent: Thursday, May 2, 2019 11:10 PM
To: user@beam.apache.org
Cc: dev
Subject: Re: kafka client interoperability

Downgrade only the KafkaIO module to the version that works for you (also excluding any transitive dependencies it pulls in); that works for us.

JC.

Lukasz Cwik <lc...@google.com>> wrote on Thu, 2 May 2019, 20:05:
+dev<ma...@beam.apache.org>

On Thu, May 2, 2019 at 10:34 AM Moorhead,Richard <Ri...@cerner.com>> wrote:
In Beam 2.9.0, this check was made:

https://github.com/apache/beam/blob/2ba00576e3a708bb961a3c64a2241d9ab32ab5b3/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaRecordCoder.java#L132

However this logic was removed in 2.10+ in the newer ProducerRecordCoder class:

https://github.com/apache/beam/blob/release-2.10.0/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/ProducerRecordCoder.java#L137


We are attempting to use Beam 2.10 with kafka 0.10.2.1; this is advertised as supported here:
https://beam.apache.org/releases/javadoc/2.10.0/org/apache/beam/sdk/io/kafka/KafkaIO.html

However we are experiencing issues with the `headers` method call mentioned above. Is there a way around this?





Re: kafka client interoperability

Posted by Juan Carlos Garcia <jc...@gmail.com>.
Downgrade only the KafkaIO module to the version that works for you (also
excluding any transitive dependencies it pulls in); that works for us.

JC.
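
A hypothetical pom.xml fragment for that module-only downgrade (the exclusion shown is an assumption; check your own dependency tree for anything else that needs excluding):

<!-- Hypothetical pom.xml fragment: keep the Beam core SDK at 2.10.0 but pin only
     the KafkaIO module at 2.9.0, excluding the core SDK it would otherwise pull
     in transitively so the 2.10.0 artifacts declared elsewhere win. -->
<dependencies>
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-core</artifactId>
    <version>2.10.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.beam</groupId>
    <artifactId>beam-sdks-java-io-kafka</artifactId>
    <version>2.9.0</version>
    <exclusions>
      <exclusion>
        <groupId>org.apache.beam</groupId>
        <artifactId>beam-sdks-java-core</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
</dependencies>

(Per Richard's message above, this combination can still run into the ConsumerSpEL checks in ProducerRecordCoder, so verify it against your pipeline.)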

Lukasz Cwik <lc...@google.com> wrote on Thu, 2 May 2019, 20:05:

> +dev <de...@beam.apache.org>
>
> On Thu, May 2, 2019 at 10:34 AM Moorhead,Richard <
> Richard.Moorhead2@cerner.com> wrote:
>
>> In Beam 2.9.0, this check was made:
>>
>>
>> https://github.com/apache/beam/blob/2ba00576e3a708bb961a3c64a2241d9ab32ab5b3/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/KafkaRecordCoder.java#L132
>>
>> However this logic was removed in 2.10+ in the newer ProducerRecordCoder
>> class:
>>
>>
>> https://github.com/apache/beam/blob/release-2.10.0/sdks/java/io/kafka/src/main/java/org/apache/beam/sdk/io/kafka/ProducerRecordCoder.java#L137
>>
>>
>> We are attempting to use Beam 2.10 with kafka 0.10.2.1; this is
>> advertised as supported here:
>>
>> https://beam.apache.org/releases/javadoc/2.10.0/org/apache/beam/sdk/io/kafka/KafkaIO.html
>>
>> However we are experiencing issues with the `headers` method call
>> mentioned above. Is there a way around this?
>>
>>
>>
>>
>>
>
