Posted to users@zeppelin.apache.org by Jongyoul Lee <jo...@gmail.com> on 2016/11/02 05:33:20 UTC

Re: spark streaming with Kafka

Zeppelin currently propagates a jar including `SparkInterpreter`. Thus, if you
add dependencies via the interpreter tab of Spark, that jar is not passed to
the executors, so this error may occur. How about using the --packages option?
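One way to pass --packages to the Spark interpreter is through SPARK_SUBMIT_OPTIONS in conf/zeppelin-env.sh. This is only a sketch: the artifact coordinates below are an assumption for Spark 2.0 with Scala 2.11, where the Kafka integration moved to a separate kafka-0-10 artifact (classes under org.apache.spark.streaming.kafka010).

```shell
# conf/zeppelin-env.sh -- options handed to spark-submit when the Spark
# interpreter process starts, so the dependency reaches driver and executors.
# Coordinates are an example for Spark 2.0.x / Scala 2.11 / Kafka 0.10;
# adjust them to your own Spark and Scala versions.
export SPARK_SUBMIT_OPTIONS="--packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.0.0"
```

Restart the Spark interpreter after editing the file so the option takes effect.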

On Mon, Oct 24, 2016 at 10:32 PM, herman.yu@teeupdata.com <
herman.yu@teeupdata.com> wrote:

> yes, yarn-client mode.
>
> In the case of Spark Streaming, does Zeppelin support graphs that update
> themselves automatically, like a self-refreshing dashboard on top of
> streaming-based Spark tables?
>
> Thanks
> Herman.
>
>
> On Oct 21, 2016, at 23:42, Jongyoul Lee <jo...@gmail.com> wrote:
>
> Hi,
>
> Do you use yarn-client mode of Spark?
>
> On Friday, 21 October 2016, herman.yu@teeupdata.com <
> herman.yu@teeupdata.com> wrote:
>
>> Hi Everyone,
>>
>> Does Zeppelin support Spark Streaming with Kafka? I am using Zeppelin
>> 0.6.1 with Spark 2.0 and Kafka 0.10.0.0.
>>
>> I got an error when importing org.apache.spark.streaming.kafka.KafkaUtils:
>> <console>:36: error: object kafka is not a member of package
>> org.apache.spark.streaming
>> import org.apache.spark.streaming.kafka.KafkaUtils
>>
>> I have already added the Spark Streaming Kafka jar to the dependencies of
>> the Spark interpreter.
>>
>> If it is supported, is there a tutorial/sample notebook?
>>
>> Thanks
>> Herman.
>>
>>
>>
>>
>
> --
> 이종열, Jongyoul Lee, 李宗烈
> http://madeng.net
>
>
>


-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net

Re: spark streaming with Kafka

Posted by Jongyoul Lee <jo...@gmail.com>.
Can you share your script? In my understanding, Zeppelin already creates a
SparkContext when it starts, so you don't need to, and must not, create a new
one yourself.

Can you please check?
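For reference, a minimal sketch of such a script (illustrative only: the broker address, group id, and topic name are made up, and it assumes the spark-streaming-kafka-0-10 integration for Spark 2.0 is on the classpath), reusing the `sc` that Zeppelin provides instead of constructing a new SparkContext:

```scala
// Illustrative sketch: assumes spark-streaming-kafka-0-10 is on the classpath
// and that `sc` is the SparkContext Zeppelin created at interpreter startup.
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

// Reuse Zeppelin's sc -- do NOT call `new SparkContext(...)` here.
val ssc = new StreamingContext(sc, Seconds(5))

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",   // assumed broker address
  "key.deserializer"  -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"          -> "zeppelin-demo"     // hypothetical consumer group
)

val stream = KafkaUtils.createDirectStream[String, String](
  ssc, PreferConsistent,
  Subscribe[String, String](Seq("test-topic"), kafkaParams))

stream.map(_.value).print()  // print each record's value per batch
ssc.start()                  // in a notebook, avoid blocking on awaitTermination
```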

On Wed, Nov 2, 2016 at 5:32 PM, Mich Talebzadeh <mi...@gmail.com>
wrote:

> This is a good question.
>
> Normally I create a streaming app (in Scala) using mvn or sbt with an uber
> jar and run that with its dependencies. I tried to run the source code in
> Zeppelin after adding /home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar
> to the dependencies, but it did not work.
>
> The problem is that there is already a SparkContext running, as seen below
> in Zeppelin's Spark log:
>
>  WARN [2016-11-02 08:23:26,559] ({pool-2-thread-10}
> Logging.scala[logWarning]:66) - Another SparkContext is being constructed
> (or threw an exception in its constructor).  This may indicate an error,
> since only one SparkContext may be running in this JVM (see SPARK-2243).
> The other SparkContext was created at:
>
> So I am not sure how this can be done through Zeppelin.
>
> HTH
>
>
>
> On 2 November 2016 at 05:33, Jongyoul Lee <jo...@gmail.com> wrote:
>
>> Zeppelin currently propagates a jar including `SparkInterpreter`. Thus, if
>> you add dependencies via the interpreter tab of Spark, that jar is not
>> passed to the executors, so this error may occur. How about using the
>> --packages option?
>>
>> On Mon, Oct 24, 2016 at 10:32 PM, herman.yu@teeupdata.com <
>> herman.yu@teeupdata.com> wrote:
>>
>>> yes, yarn-client mode.
>>>
>>> In the case of Spark Streaming, does Zeppelin support graphs that update
>>> themselves automatically, like a self-refreshing dashboard on top of
>>> streaming-based Spark tables?
>>>
>>> Thanks
>>> Herman.
>>>
>>>
>>> On Oct 21, 2016, at 23:42, Jongyoul Lee <jo...@gmail.com> wrote:
>>>
>>> Hi,
>>>
>>> Do you use yarn-client mode of Spark?
>>>
>>> On Friday, 21 October 2016, herman.yu@teeupdata.com <
>>> herman.yu@teeupdata.com> wrote:
>>>
>>>> Hi Everyone,
>>>>
>>>> Does Zeppelin support Spark Streaming with Kafka? I am using Zeppelin
>>>> 0.6.1 with Spark 2.0 and Kafka 0.10.0.0.
>>>>
>>>> I got an error when importing org.apache.spark.streaming.kafka.KafkaUtils:
>>>> <console>:36: error: object kafka is not a member of package
>>>> org.apache.spark.streaming
>>>> import org.apache.spark.streaming.kafka.KafkaUtils
>>>>
>>>> I have already added the Spark Streaming Kafka jar to the dependencies of
>>>> the Spark interpreter.
>>>>
>>>> If it is supported, is there a tutorial/sample notebook?
>>>>
>>>> Thanks
>>>> Herman.
>>>>
>>>>
>>>>
>>>>
>>>
>>> --
>>> 이종열, Jongyoul Lee, 李宗烈
>>> http://madeng.net
>>>
>>>
>>>
>>
>>
>> --
>> 이종열, Jongyoul Lee, 李宗烈
>> http://madeng.net
>>
>
>


-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net

Re: spark streaming with Kafka

Posted by Mich Talebzadeh <mi...@gmail.com>.
This is a good question.

Normally I create a streaming app (in Scala) using mvn or sbt with an uber
jar and run that with its dependencies. I tried to run the source code in
Zeppelin after adding
/home/hduser/jars/spark-streaming-kafka-assembly_2.10-1.6.1.jar
to the dependencies, but it did not work.

The problem is that there is already a SparkContext running, as seen below in
Zeppelin's Spark log:

 WARN [2016-11-02 08:23:26,559] ({pool-2-thread-10}
Logging.scala[logWarning]:66) - Another SparkContext is being constructed
(or threw an exception in its constructor).  This may indicate an error,
since only one SparkContext may be running in this JVM (see SPARK-2243).
The other SparkContext was created at:

So I am not sure how this can be done through Zeppelin.

HTH
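The conflict in the log above can be contrasted in a short sketch (a hypothetical snippet, not from the thread): constructing a second context in the same JVM is what triggers the SPARK-2243 warning, while SparkContext.getOrCreate returns the context Zeppelin already started.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Wrong inside Zeppelin: a second SparkContext in the same JVM triggers the
// "Another SparkContext is being constructed" warning (SPARK-2243).
// val conflicting = new SparkContext(new SparkConf().setAppName("stream"))

// Safe: returns the already-running context instead of constructing a new one.
val ctx = SparkContext.getOrCreate()
```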


Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 2 November 2016 at 05:33, Jongyoul Lee <jo...@gmail.com> wrote:

> Zeppelin currently propagates a jar including `SparkInterpreter`. Thus, if
> you add dependencies via the interpreter tab of Spark, that jar is not
> passed to the executors, so this error may occur. How about using the
> --packages option?
>
> On Mon, Oct 24, 2016 at 10:32 PM, herman.yu@teeupdata.com <
> herman.yu@teeupdata.com> wrote:
>
>> yes, yarn-client mode.
>>
>> In the case of Spark Streaming, does Zeppelin support graphs that update
>> themselves automatically, like a self-refreshing dashboard on top of
>> streaming-based Spark tables?
>>
>> Thanks
>> Herman.
>>
>>
>> On Oct 21, 2016, at 23:42, Jongyoul Lee <jo...@gmail.com> wrote:
>>
>> Hi,
>>
>> Do you use yarn-client mode of Spark?
>>
>> On Friday, 21 October 2016, herman.yu@teeupdata.com <
>> herman.yu@teeupdata.com> wrote:
>>
>>> Hi Everyone,
>>>
>>> Does Zeppelin support Spark Streaming with Kafka? I am using Zeppelin
>>> 0.6.1 with Spark 2.0 and Kafka 0.10.0.0.
>>>
>>> I got an error when importing org.apache.spark.streaming.kafka.KafkaUtils:
>>> <console>:36: error: object kafka is not a member of package
>>> org.apache.spark.streaming
>>> import org.apache.spark.streaming.kafka.KafkaUtils
>>>
>>> I have already added the Spark Streaming Kafka jar to the dependencies of
>>> the Spark interpreter.
>>>
>>> If it is supported, is there a tutorial/sample notebook?
>>>
>>> Thanks
>>> Herman.
>>>
>>>
>>>
>>>
>>
>> --
>> 이종열, Jongyoul Lee, 李宗烈
>> http://madeng.net
>>
>>
>>
>
>
> --
> 이종열, Jongyoul Lee, 李宗烈
> http://madeng.net
>