Posted to user@spark.apache.org by "Ulanov, Alexander" <al...@hp.com> on 2014/06/11 19:25:11 UTC

Adding external jar to spark-shell classpath in spark 1.0

Hi,

I am currently using Spark 1.0 locally on Windows 7. I would like to use classes from an external jar in the spark-shell. I followed the instructions in: http://mail-archives.apache.org/mod_mbox/spark-user/201402.mbox/%3CCALrNVjWWF6k=c_JrhoE9W_qaACJLD4+kbdUhfV0Pitr8H1fXnw@mail.gmail.com%3E

I have set ADD_JARS="my.jar" SPARK_CLASSPATH="my.jar" in spark-shell.cmd but this didn't work.

I also tried running "spark-shell.cmd --jars my.jar --driver-class-path my.jar --driver-library-path my.jar" and it didn't work either.

I cannot load any class from my jar into the spark-shell. By the way, my.jar contains a simple Scala class.

Best regards, Alexander

Re: Adding external jar to spark-shell classpath in spark 1.0

Posted by Shivani Rao <ra...@gmail.com>.
@Marcelo: The command ./bin/spark-shell --jars jar1,jar2,etc,etc did not
work for me on a Linux machine.

What I did was append the classpath in the bin/compute-classpath.sh
file. I ran the script, then started the Spark shell, and that worked.
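For reference, that workaround might look like the sketch below. The CLASSPATH variable name matches what bin/compute-classpath.sh assembles in Spark 1.0, but the jar location is a placeholder, not a path from this thread; adjust it to your setup.

```shell
# Sketch of the compute-classpath.sh workaround described above.
# Append the external jar to the classpath the script builds;
# /path/to/my.jar is a placeholder. Add a line like this near the
# end of bin/compute-classpath.sh, before the classpath is echoed:
CLASSPATH="$CLASSPATH:/path/to/my.jar"

# Then confirm the jar appears in the computed classpath:
./bin/compute-classpath.sh | tr ':' '\n' | grep my.jar
```

Editing a script shipped with Spark is brittle across upgrades, so this is best treated as a stopgap until --jars works.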


Thanks
Shivani


On Wed, Jun 11, 2014 at 10:52 AM, Andrew Or <an...@databricks.com> wrote:



-- 
Software Engineer
Analytics Engineering Team @ Box
Mountain View, CA

Re: Adding external jar to spark-shell classpath in spark 1.0

Posted by Andrew Or <an...@databricks.com>.
Ah, of course, there are no application jars in spark-shell, so it seems
there is no workaround for this at the moment. We will look into a
fix shortly, but for now you will have to create an application and use
spark-submit (or use spark-shell on Linux).
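A rough sketch of that workaround, assuming an sbt-built project; the class name MyApp, the jar path, and the Scala 2.10 artifact suffix are illustrative assumptions, not details from this thread.

```shell
# Hypothetical spark-submit workaround: package the code as an
# application jar and submit it rather than loading it into spark-shell.
# MyApp and the jar path below are assumed names.
sbt package                      # build the application jar

./bin/spark-submit \
  --class MyApp \                # main class bundled in the jar
  --master "local[*]" \          # run locally, as the original poster does
  target/scala-2.10/my-app_2.10-1.0.jar
```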


2014-06-11 10:42 GMT-07:00 Ulanov, Alexander <al...@hp.com>:


RE: Adding external jar to spark-shell classpath in spark 1.0

Posted by "Ulanov, Alexander" <al...@hp.com>.
Could you elaborate on this? I don’t have an application; I just use the spark-shell.

From: Andrew Or [mailto:andrew@databricks.com]
Sent: Wednesday, June 11, 2014 9:40 PM
To: user@spark.apache.org
Subject: Re: Adding external jar to spark-shell classpath in spark 1.0

This is a known issue: https://issues.apache.org/jira/browse/SPARK-1919. We haven't found a fix yet, but for now you can work around this by including your simple class in your application jar.



Re: Adding external jar to spark-shell classpath in spark 1.0

Posted by Andrew Or <an...@databricks.com>.
This is a known issue: https://issues.apache.org/jira/browse/SPARK-1919. We
haven't found a fix yet, but for now you can work around this by including
your simple class in your application jar.
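As a sketch of that workaround: move the simple class into the application's own source tree so it is compiled into the jar that spark-submit ships. The file path, class name, and sbt step below are assumptions for illustration, not names from this thread.

```shell
# Put the class (previously in my.jar) into the application source
# tree, so rebuilding bundles it into the application jar.
mkdir -p src/main/scala
cat > src/main/scala/SimpleClass.scala <<'EOF'
// the simple Scala class the original poster wants on the classpath
class SimpleClass {
  def greet(name: String): String = "Hello, " + name
}
EOF
sbt package    # the application jar now contains SimpleClass
```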


2014-06-11 10:25 GMT-07:00 Ulanov, Alexander <al...@hp.com>:


RE: Adding external jar to spark-shell classpath in spark 1.0

Posted by "Ulanov, Alexander" <al...@hp.com>.
Are you able to import any class from your jars within spark-shell?

-----Original Message-----
From: Marcelo Vanzin [mailto:vanzin@cloudera.com] 
Sent: Wednesday, June 11, 2014 9:36 PM
To: user@spark.apache.org
Subject: Re: Adding external jar to spark-shell classpath in spark 1.0

Ah, not that it should matter, but I'm on Linux and you seem to be on Windows... maybe there is something weird going on with the Windows launcher?

On Wed, Jun 11, 2014 at 10:34 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
> Just tried this and it worked fine for me:
>
> ./bin/spark-shell --jars jar1,jar2,etc,etc



--
Marcelo

Re: Adding external jar to spark-shell classpath in spark 1.0

Posted by Marcelo Vanzin <va...@cloudera.com>.
Ah, not that it should matter, but I'm on Linux and you seem to be on
Windows... maybe there is something weird going on with the Windows
launcher?

On Wed, Jun 11, 2014 at 10:34 AM, Marcelo Vanzin <va...@cloudera.com> wrote:
> Just tried this and it worked fine for me:
>
> ./bin/spark-shell --jars jar1,jar2,etc,etc



-- 
Marcelo

Re: Adding external jar to spark-shell classpath in spark 1.0

Posted by Marcelo Vanzin <va...@cloudera.com>.
Just tried this and it worked fine for me:

./bin/spark-shell --jars jar1,jar2,etc,etc
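For anyone following along, usage on Linux might look like the sketch below; my.jar and SimpleClass are placeholder names carried over from this thread, not real artifacts.

```shell
# Start the shell with the extra jar(s); --jars takes a
# comma-separated list of paths.
./bin/spark-shell --jars my.jar

# Inside the shell, a class from my.jar should then be usable, e.g.:
#   scala> val s = new SimpleClass()
#   scala> s.greet("Spark")
```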

On Wed, Jun 11, 2014 at 10:25 AM, Ulanov, Alexander
<al...@hp.com> wrote:



-- 
Marcelo