Posted to user@flink.apache.org by kant kodali <ka...@gmail.com> on 2018/02/25 02:36:02 UTC

Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase

Hi,

I couldn't get Flink and Kafka working together. It looks like all the
examples I tried from the website fail with the following exception.

Caused by: java.lang.ClassNotFoundException:
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase


*Or, when I do something like this, as it is shown on the website:*


 val stream = senv.addSource(new FlinkKafkaConsumer08[String]("join_test",
new SimpleStringSchema(), properties)).print()

*I get the following exception*

<console>:73: error: overloaded method value addSource with alternatives:

  [T](function:
org.apache.flink.streaming.api.functions.source.SourceFunction.SourceContext[T]
=> Unit)(implicit evidence$10:
org.apache.flink.api.common.typeinfo.TypeInformation[T])org.apache.flink.streaming.api.scala.DataStream[T]
<and>

  [T](function:
org.apache.flink.streaming.api.functions.source.SourceFunction[T])(implicit
evidence$9:
org.apache.flink.api.common.typeinfo.TypeInformation[T])org.apache.flink.streaming.api.scala.DataStream[T]

 cannot be applied to
(org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08[String])

       val stream = senv.addSource(new
FlinkKafkaConsumer08[String]("join_test", new SimpleStringSchema(),
properties)).print()

Can anyone share a simple example of how to get a Kafka stream as a Table
using the Scala shell? No need for any fancy schema; it just needs to print
the value. I am using the latest version of Flink, 1.4.1, and my lib folder
contains flink-connector-kafka-0.8_2.11-1.4.1.jar.

I wanted to use Kafka 0.9, but that didn't work, so I thought I would just
get something working first and downgraded to 0.8. But the 0.8 examples on
the website also don't seem to work in the Scala shell.

Thanks!!
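[Editor's note: for reference, a minimal sketch of the shell session being attempted, with the `properties` setup spelled out. It assumes both the flink-connector-kafka-0.8_2.11 jar and the flink-connector-kafka-base_2.11 jar are on the shell's classpath (the resolution reached later in this thread); the broker address, ZooKeeper address, group id, and topic name are placeholders.]

```scala
// Inside the Flink 1.4 Scala shell, where `senv` is the pre-bound
// StreamExecutionEnvironment. Addresses, group id, and topic name below
// are assumptions; adjust them to your setup.
import java.util.Properties
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

val properties = new Properties()
properties.setProperty("bootstrap.servers", "localhost:9092")
properties.setProperty("zookeeper.connect", "localhost:2181") // required by the 0.8 consumer
properties.setProperty("group.id", "test-group")

// Print each record's value as a plain string.
senv
  .addSource(new FlinkKafkaConsumer08[String]("join_test", new SimpleStringSchema(), properties))
  .print()

senv.execute("kafka-print")
```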

Re: Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase

Posted by kant kodali <ka...@gmail.com>.
Thanks a lot!


Re: Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase

Posted by Nico Kruber <ni...@data-artisans.com>.
Judging from the code, you should separate different jars with a colon
":", i.e. "--addclasspath jar1:jar2"


Nico
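[Editor's note: a quick sketch of how that looks when launching the shell. The jar paths are placeholders pointing at wherever the two connector jars were downloaded; the launch line itself is commented out since it needs a Flink 1.4.1 distribution on hand.]

```shell
# Placeholder jar locations; point these at the downloaded connector jars.
KAFKA_JAR="/path/to/flink-connector-kafka-0.8_2.11-1.4.1.jar"
BASE_JAR="/path/to/flink-connector-kafka-base_2.11-1.4.1.jar"

# Jars passed to --addclasspath are joined with ":", like a Java classpath.
CLASSPATH_ARG="$KAFKA_JAR:$BASE_JAR"
echo "$CLASSPATH_ARG"

# Then launch the local Scala shell with both jars available:
# ./bin/start-scala-shell.sh local --addclasspath "$CLASSPATH_ARG"
```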



Re: Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase

Posted by kant kodali <ka...@gmail.com>.
Hi Gordon,

Thanks for the response!! How do I add multiple jars to the classpath? Are
they separated by a semicolon while still using one flag, like
"--addclasspath jar1; jar2"? Or do I specify the flag multiple times, like
"--addclasspath jar1 --addclasspath jar2"? Or do I specify just the
directory, like "--addclasspath ./opt", so it adds all the jars in that
directory?

Thanks!


Re: Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase

Posted by "Tzu-Li (Gordon) Tai" <tz...@apache.org>.
Hi,

Good to see that you have it working! Yes, each of the Kafka version-specific connectors also has a dependency on the base Kafka connector module.

Note that it is usually not recommended to put optional dependencies (such as the connectors) under the lib folder.
To add additional dependencies when using the Scala shell, there is an "--addclasspath" option which allows you to specify paths to the dependency jars.

Cheers,
Gordon


Re: Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase

Posted by kant kodali <ka...@gmail.com>.
The exception went away after downloading flink-connector-kafka-base_2.11-1.4.1.jar
to the lib folder.
