Posted to users@zeppelin.apache.org by Arkadiusz Komarzewski <ak...@gmail.com> on 2016/10/10 17:16:03 UTC

Problem with local jars in Zeppelin 0.6.1 with Spark 2.0.0

Hi everyone,

I'm using Zeppelin 0.6.1 with Spark 2.0.0 on Yarn (with master configured
as yarn-client). I have a couple of jars stored locally which I'd like to
use in the Spark interpreter. I've put these jars in `spark-defaults.conf`
under `spark.jars` (as described in
https://zeppelin.apache.org/docs/0.6.1/interpreter/spark.html#2-loading-spark-properties).
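
For reference, the relevant entry in `spark-defaults.conf` looks roughly
like this (the paths are placeholders, not my actual jars):

    spark.jars  /opt/jars/lib-one.jar,/opt/jars/lib-two.jar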

Despite this, the jars seem to be unavailable in Zeppelin's Spark
interpreter: when I import any class from them I get "error:
object SOMEOBJECT is not a member of package SOMEPACKAGE".
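
Concretely, a paragraph like the following fails (the package and object
names are placeholders for my actual library):

    %spark
    import SOMEPACKAGE.SOMEOBJECT
    // error: object SOMEOBJECT is not a member of package SOMEPACKAGE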

I checked `spark.conf.getAll` (via the interpreter), and there's something
interesting: all my jars are listed under `spark.yarn.secondary.jars` and
`spark.yarn.dist.jars`. So it looks like they're available on the cluster,
but unavailable in the driver program.
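
This is roughly the check I ran (output trimmed; jar names are placeholders):

    %spark
    spark.conf.getAll.filter(_._1.contains("jars")).foreach(println)
    // (spark.yarn.dist.jars,file:/opt/jars/lib-one.jar,...)
    // (spark.yarn.secondary.jars,lib-one.jar,...)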

Does anyone have any idea what's going on here? Have I missed some config
options?


Regards,
Arkadiusz

Re: Problem with local jars in Zeppelin 0.6.1 with Spark 2.0.0

Posted by Arkadiusz Komarzewski <ak...@gmail.com>.
Thanks a lot, Mich!
Somehow I missed that it's possible to put a filesystem path in the
interpreter config.
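
For anyone who finds this thread later, the fix was roughly the following:
in the Interpreter page, edit the Spark interpreter and add the jar under
Dependencies, using the local path as the artifact (the path here is just a
placeholder for my actual jar):

    artifact: /opt/jars/lib-one.jar
    exclude:

After saving, the interpreter restarts and the jar ends up on the driver's
classpath as well, so the imports work.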

Regards,
Arkadiusz


Re: Problem with local jars in Zeppelin 0.6.1 with Spark 2.0.0

Posted by Mich Talebzadeh <mi...@gmail.com>.
What are you doing in Spark that causes Zeppelin to throw that error?

Have you loaded the jars into the interpreter as dependencies as well?

[image: Inline images 1]
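
In case the inline screenshot doesn't come through on the list: dependencies
are added in the interpreter edit screen under Dependencies, where the
artifact can be either Maven coordinates or a local file path. If you prefer
doing it from a note, the (older) %dep interpreter should also work, run
before the Spark interpreter starts (the path is a placeholder):

    %dep
    z.reset()
    z.load("/opt/jars/lib-one.jar")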

HTH

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.


