Posted to users@zeppelin.apache.org by "Balachandar R.A." <ba...@gmail.com> on 2016/01/21 09:41:29 UTC

Providing third party jar files to spark

Hello

My Spark-based map tasks need to access third-party jar files. I found the
following options for submitting third-party jar files to the Spark
interpreter (the first two are shown concretely below):

1. export SPARK_SUBMIT_OPTIONS=<the jar files, comma separated> in
conf/zeppelin-env.sh

2. include the setting spark.jars <the jar files, comma separated> in
<spark home>/conf/spark-defaults.conf

3. use z.load("<location of the jar file on the local filesystem>") in a
Zeppelin notebook
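
Concretely, the first two would look something like this (the jar paths are
placeholders):

# in conf/zeppelin-env.sh (these options are passed straight to spark-submit)
export SPARK_SUBMIT_OPTIONS="--jars /path/to/first.jar,/path/to/second.jar"

# in <spark home>/conf/spark-defaults.conf
spark.jars  /path/to/first.jar,/path/to/second.jar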

I tested the first two and they both work fine. The third one does not.
Here is the snippet I use:

%dep
z.reset()
z.load("file:///home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar")


Further, importing a class that belongs to the above jar file works when I
use an import com.....  statement in the Zeppelin notebook. However, I get a
ClassNotFoundException in the executor for the same class.

Any clue here would help greatly


regards
Bala

Re: Providing third party jar files to spark

Posted by "Balachandar R.A." <ba...@gmail.com>.
Hello

I was able to fix this issue. It was just a problem with the URL of the
file. I was using it like below:

z.load("file:///home/bala/mapreduce.jar")

But when I omitted file:// from the above URL, it worked:

z.load("/home/bala/mapreduce.jar")


regards
Bala


Re: Providing third party jar files to spark

Posted by "Balachandar R.A." <ba...@gmail.com>.
Hi falmeida,

Thanks for the response, but I do not want to use SPARK_SUBMIT_OPTIONS.

thanks and regards
Bala

Re: Providing third party jar files to spark

Posted by Felipe Almeida <fa...@gmail.com>.
Hi Balachandar, I think I just went through that very same problem and
solved it with the help of Moon Soo Lee.

Here is the solution:
http://stackoverflow.com/questions/35005455/java-npe-when-loading-a-dependency-from-maven-from-within-zeppelin-on-aws-emr

-- 
“Every time you stay out late; every time you sleep in; every time you miss
a workout; every time you don’t give 100% – You make it that much easier
for me to beat you.” - Unknown author

Re: Providing third party jar files to spark

Posted by "Balachandar R.A." <ba...@gmail.com>.
Hi

I believe this trick will work, because I have already run the job through
spark-submit using the --jars option, roughly as sketched below. Will give
it a try, though.
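
For reference, the spark-submit invocation was along these lines (the class
name and application jar here are placeholders):

bin/spark-submit \
  --class com.example.MyJob \
  --jars /home/bala/mapreduce.jar \
  /path/to/app.jar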

Bala

Re: Providing third party jar files to spark

Posted by Hyung Sung Shim <hs...@nflabs.com>.
Hello.
The loadAndDist() method does not work for me either. Maybe it's
deprecated; I'll check and fix the documentation.

Using spark-shell, you can run your application with the following steps:
1. Remove the jar configuration from spark-defaults.conf.
2. From the Spark home, run bin/spark-shell --jars "YOUR,JARS,COMMA,SEPARATED"
   (a quick sanity check is sketched below).
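
Once the shell is up, you could verify that the jar is visible on both the
driver and the executors with something like this (the class name is a
placeholder for one from your jar):

scala> Class.forName("com.example.SomeClassFromYourJar")   // driver side
scala> sc.parallelize(1 to 4).map(_ => Class.forName("com.example.SomeClassFromYourJar").getName).collect()   // executor side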

If you share your application code and environment information (the
Zeppelin and Spark versions you're using, zeppelin-env.sh, etc.), I might be
able to help.

Thanks.
<https://gitlab.com/search?group_id=&project_id=769187&scope=issues&search=spark-shell#2-function-define>





Re: Providing third party jar files to spark

Posted by "Balachandar R.A." <ba...@gmail.com>.
Hello,

I tried to use z.loadAndDist(), but it says:

<console>:17: error: value loadAndDist is not a member of
org.apache.zeppelin.spark.dep.DependencyContext

Any idea what this method is for?


regards
Bala


Re: Providing third party jar files to spark

Posted by "Balachandar R.A." <ba...@gmail.com>.
Hello,

I have run the code in spark-shell successfully, but the jar files were all
specified in the config file (spark-defaults.conf). However, I will not be
able to use z.load() in spark-shell, will I? I am sorry, but I did not quite
follow the idea behind running through spark-shell. Wail's suggestion is to
create a fat jar? I will give it a try, but how do I make sure this fat jar
is accessible to the Spark executors? Anyway, I will keep you posted on
this.
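
My understanding so far is that anything passed via --jars or spark.jars, or
registered at runtime with sc.addJar, is shipped to the executors
automatically, so a fat jar should be covered by the same mechanisms, e.g.
(the path is a placeholder):

sc.addJar("/home/bala/mapreduce-assembly.jar")  // ships the jar to every executor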

regards
Bala


Re: Providing third party jar files to spark

Posted by Hyung Sung Shim <hs...@nflabs.com>.
Hello.
I think Wail Alkowaileet's suggestion is plausible.
Balachandar, could you try running your application with spark-shell?



Re: Providing third party jar files to spark

Posted by Wail Alkowaileet <wa...@gmail.com>.
I used z.load in my case and it seems to work just fine.
Can you try spark-shell with your jar file and see what the error is?

I assume the problem is that your application requires third-party jars.
Therefore, you need to build your app with 'assembly'; a sketch follows.
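
A minimal sketch of what I mean, assuming an sbt build with the sbt-assembly
plugin (the plugin and Spark versions are illustrative):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")

// build.sbt -- mark Spark itself as provided so it is not bundled
name := "mapreduce"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"

Running "sbt assembly" then produces a single jar under target/ containing
your code together with all of its third-party dependencies.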




-- 

Regards,
Wail Alkowaileet

Re: Providing third party jar files to spark

Posted by "Balachandar R.A." <ba...@gmail.com>.
Hello Hyung,

There is nothing more I could make out from the error log; it is simply a
plain ClassNotFoundException.


Re: Providing third party jar files to spark

Posted by Hyung Sung Shim <hs...@nflabs.com>.
It's weird... so could you send the error log for details?


Re: Providing third party jar files to spark

Posted by "Balachandar R.A." <ba...@gmail.com>.
Hi Hyung,

Thanks for the response. I have tried this, but it did not work.

regards
Bala


Re: Providing third party jar files to spark

Posted by Hyung Sung Shim <hs...@nflabs.com>.
Hello, Balachandar.
For the third option you tried, the dependency loading must be the first
thing executed in the notebook.
Could you try restarting Zeppelin and running the "%dep z.load()" paragraph
first, as sketched below?
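
In other words, right after the restart, the very first paragraph to run
should be the dependency one, and only then the Spark code, e.g. (the jar
path and class name are placeholders):

%dep
z.reset()
z.load("/path/to/your.jar")

then, in a separate paragraph:

%spark
import com.yourpackage.YourClass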



Fwd: Providing third party jar files to spark

Posted by "Balachandar R.A." <ba...@gmail.com>.
Hi

Any help would be greatly appreciated :-)

---------- Forwarded message ----------
From: Balachandar R.A. <ba...@gmail.com>
Date: 21 January 2016 at 14:11
Subject: Providing third party jar files to spark
To: users@zeppelin.incubator.apache.org


Hello

My Spark-based map tasks need to access third-party jar files. I found the
following options for submitting third-party jar files to the Spark
interpreter:

1. export SPARK_SUBMIT_OPTIONS=<the jar files, comma separated> in
conf/zeppelin-env.sh

2. include the setting spark.jars <the jar files, comma separated> in
<spark home>/conf/spark-defaults.conf

3. use z.load("<location of the jar file on the local filesystem>") in a
Zeppelin notebook

I tested the first two and they both work fine. The third one does not.
Here is the snippet I use:

%dep
z.reset()
z.load("file:///home/bala/Projects/pocv8.new/mapreduce/build/libs/mapreduce.jar")


Further, importing a class that belongs to the above jar file works when I
use an import com.....  statement in the Zeppelin notebook. However, I get a
ClassNotFoundException in the executor for the same class.

Any clue here would help greatly


regards
Bala