Posted to user@spark.apache.org by "sagarcasual ." <sa...@gmail.com> on 2016/09/22 18:37:23 UTC

Error while Spark 1.6.1 streaming from Kafka-2.11_0.10.0.1 cluster

Hello,

I am trying to stream data out of a Kafka cluster (Scala 2.11, Kafka
0.10.0.1) using Spark 1.6.1.
I am receiving the following error, and I have confirmed that the topic
I am trying to connect to exists and contains data.

Any idea what could be the cause?

kafka.common.UnknownTopicOrPartitionException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at kafka.common.ErrorMapping$.exceptionFor(ErrorMapping.scala:102)
    at org.apache.spark.streaming.kafka.KafkaRDD$KafkaRDDIterator.handleFetchErr(KafkaRDD.scala:184)
    at org.apache.spark.streaming.kafka.KafkaRDD$KafkaRDDIterator.fetchBatch(KafkaRDD.scala:193)
    at org.apache.spark.streaming.kafka.KafkaRDD$KafkaRDDIterator.getNext(KafkaRDD.scala:208)
    at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
    at scala.collection.convert.Wrappers$IteratorWrapper.hasNext(Wrappers.scala:29)

Re: Error while Spark 1.6.1 streaming from Kafka-2.11_0.10.0.1 cluster

Posted by "sagarcasual ." <sa...@gmail.com>.
Hi, thanks for the response.
The issue I am facing occurs only against the clustered Kafka (Scala 2.11,
version 0.10.0.1) with Spark 1.6.1 and the following dependencies:
org.apache.spark:spark-core_2.10:1.6.1
compile group: 'org.apache.spark', name: 'spark-streaming_2.10', version: '1.6.1'
compile group: 'org.apache.spark', name: 'spark-streaming-kafka_2.10', version: '1.6.1'

With a standalone Kafka broker, all other versions remaining the same, I
am able to connect and consume from Kafka via Spark Streaming without any
problem.

Do you know if we have to do anything special when calling
createDirectStream/createRDD against a clustered Kafka 0.10 consumed via
Spark 1.6.1?
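
For reference, the stream is created along these lines (a minimal sketch;
the broker list and topic name are placeholders, not my actual values):

    import kafka.serializer.StringDecoder
    import org.apache.spark.streaming.kafka.KafkaUtils

    // Minimal sketch of the 1.6.1 direct stream; assumes an existing
    // StreamingContext `ssc`. Broker list and topic are placeholders.
    // Against a multi-broker cluster, metadata.broker.list should name
    // the cluster's brokers, not a single standalone host.
    val kafkaParams = Map[String, String](
      "metadata.broker.list" -> "broker1:9092,broker2:9092,broker3:9092")
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, Set("my-topic"))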

-Regards

Sagar




Re: Error while Spark 1.6.1 streaming from Kafka-2.11_0.10.0.1 cluster

Posted by Cody Koeninger <co...@koeninger.org>.
For Spark 2.0 there are two kafka artifacts,
spark-streaming-kafka-0-10 (0.10 and later brokers only) and
spark-streaming-kafka-0-8 (should work with 0.8 and later brokers).
The docs explaining this were merged to master just after 2.0 was
released, so they haven't been published yet.
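
In Gradle terms that would be something like the following (assuming a
Scala 2.11 build against Spark 2.0.0; adjust the Scala suffix and version
to your build):

compile group: 'org.apache.spark', name: 'spark-streaming-kafka-0-10_2.11', version: '2.0.0'
compile group: 'org.apache.spark', name: 'spark-streaming-kafka-0-8_2.11', version: '2.0.0'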

There are usage examples of the 0-10 connector at
https://github.com/koeninger/kafka-exactly-once

The change to Spark 2.0 was really straightforward for the one or two
jobs I switched over, for what it's worth.




Fwd: Error while Spark 1.6.1 streaming from Kafka-2.11_0.10.0.1 cluster

Posted by "sagarcasual ." <sa...@gmail.com>.
Also, you mentioned the streaming-kafka-0-10 connector. What connector is
this, and do you know the dependency? I did not see any mention of it in
the documentation.
For my current Spark 1.6.1 against standalone Kafka 0.10.0.1, the only
dependencies I have are:

org.apache.spark:spark-core_2.10:1.6.1
compile group: 'org.apache.spark', name: 'spark-streaming_2.10', version: '1.6.1'
compile group: 'org.apache.spark', name: 'spark-streaming-kafka_2.10', version: '1.6.1'
compile group: 'org.apache.spark', name: 'spark-sql_2.10', version: '1.6.1'

For Spark 2.0 with Kafka 0.10.0.1, do I need a different Kafka connector
dependency?


On Thu, Sep 22, 2016 at 2:21 PM, sagarcasual . <sa...@gmail.com> wrote:

> Hi Cody,
> Thanks for the response.
> One thing I forgot to mention is that I am using the direct approach (no
> receivers) in Spark Streaming.
>
> I am not sure I have the leverage to upgrade at this point, but do you
> know if the Spark 1.6.1 to Spark 2.0 jump is usually smooth, or whether
> it involves a lot of hiccups? Also, is there a migration guide or
> something?
>
> -Regards
> Sagar

Re: Error while Spark 1.6.1 streaming from Kafka-2.11_0.10.0.1 cluster

Posted by Cody Koeninger <co...@koeninger.org>.
Do you have the ability to try using Spark 2.0 with the
streaming-kafka-0-10 connector?

I'd expect the 1.6.1 version to be compatible with kafka 0.10, but it
would be good to rule that out.
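
Roughly, the 0-10 API looks like the sketch below (assuming an existing
StreamingContext; the broker list, group id, and topic are placeholders,
not a drop-in replacement for your job):

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

    // Minimal sketch: `ssc` is an existing StreamingContext; the new
    // connector uses the Kafka consumer config keys (bootstrap.servers
    // instead of metadata.broker.list).
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "broker1:9092,broker2:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "example-group")
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("my-topic"), kafkaParams))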


---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org