Posted to user@spark.apache.org by Soumya Simanta <so...@gmail.com> on 2014/02/04 22:28:21 UTC
Adding external jar to spark-shell classpath using ADD_JARS
Hi,
I have a Spark cluster, and I want to use classes from a 3rd-party jar in my
shell.
I'm starting my spark shell using the following command.
MASTER="spark://n001:7077"
ADD_JARS=/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar
SPARK_MEM="24G" ./spark-shell
I also see the following in the logs.
14/02/04 16:09:25 INFO SparkContext: Added JAR
/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar at
http://10.27.112.32:59460/jars/twitter4j-core-3.0.5.jar with timestamp
1391548165483
However, when I try to import one of the classes in that jar file I get the
following error.
scala> import twitter4j.Status
<console>:10: error: not found: value twitter4j
import twitter4j.Status
^
Re: Adding external jar to spark-shell classpath using ADD_JARS
Posted by Soumya Simanta <so...@gmail.com>.
Okay, now if I want to add a properties file to my classpath, will it work?
For example, the following doesn't work:
MASTER="spark://n001:7077" \
ADD_JARS="/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar;twitter4j.properites" \
SPARK_CLASSPATH="/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar;twitter4j.properites" \
SPARK_MEM="24G" ./spark-shell
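Two likely culprits in the command above (a sketch, not tested against this cluster): on Linux the JVM's classpath separator is `:`, not `;`, and a properties file is resolved from a *directory* (or jar) on the classpath rather than by naming the file itself. Note also that Twitter4J looks for a file named `twitter4j.properties`, while the command spells it `properites`. Assuming the file lives at `/home/myuserid/twitter4j/twitter4j.properties` (a hypothetical location), the invocation might look like:

```shell
# Sketch only: ':' is the Unix classpath separator (';' is Windows-specific),
# and the directory containing twitter4j.properties goes on the classpath,
# not the properties file itself. /home/myuserid/twitter4j is an assumed
# location for that file.
MASTER="spark://n001:7077" \
ADD_JARS="/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar" \
SPARK_CLASSPATH="/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar:/home/myuserid/twitter4j" \
SPARK_MEM="24G" ./spark-shell
```

Twitter4J also documents picking up `twitter4j.properties` from the process's working directory, so launching spark-shell from the directory holding the file may be the simpler route.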
On Tue, Feb 4, 2014 at 4:38 PM, Soumya Simanta <so...@gmail.com> wrote:
>
>
>
> On Tue, Feb 4, 2014 at 4:31 PM, Marek Wiewiorka <marek.wiewiorka@gmail.com> wrote:
>
>> Try adding these jars to SPARK_CLASSPATH as well.
>>
>
> Okay I changed it to the following and it works now. Thanks.
>
> MASTER="spark://n001:7077"
> ADD_JARS="/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar"
> SPARK_CLASSPATH="/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar"
> SPARK_MEM="24G" ./spark-shell
>
Re: Adding external jar to spark-shell classpath using ADD_JARS
Posted by Soumya Simanta <so...@gmail.com>.
On Tue, Feb 4, 2014 at 4:31 PM, Marek Wiewiorka <ma...@gmail.com> wrote:
> Try adding these jars to SPARK_CLASSPATH as well.
>
Okay I changed it to the following and it works now. Thanks.
MASTER="spark://n001:7077"
ADD_JARS="/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar"
SPARK_CLASSPATH="/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar"
SPARK_MEM="24G" ./spark-shell
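For readers skimming the thread, the working pattern sets both variables; a cleaned-up sketch using a shell variable to avoid repeating the path:

```shell
# Sketch of the working invocation from this thread (0.9-era shell):
# ADD_JARS registers the jar with the SparkContext so it is served to the
# executors (the "Added JAR ... at http://..." log line above), while
# SPARK_CLASSPATH puts it on the driver/REPL classpath so the import
# resolves inside spark-shell. Both are needed.
JAR="/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar"
MASTER="spark://n001:7077" ADD_JARS="$JAR" SPARK_CLASSPATH="$JAR" \
SPARK_MEM="24G" ./spark-shell
```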
Re: Adding external jar to spark-shell classpath using ADD_JARS
Posted by Marek Wiewiorka <ma...@gmail.com>.
Try adding these jars to SPARK_CLASSPATH as well.
2014-02-04 Soumya Simanta <so...@gmail.com>:
> Hi,
>
> I have a Spark cluster, and I want to use classes from a 3rd-party jar in
> my shell.
>
> I'm starting my spark shell using the following command.
>
>
> MASTER="spark://n001:7077"
> ADD_JARS=/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar
> SPARK_MEM="24G" ./spark-shell
>
>
> I also see the following in the logs.
>
> 14/02/04 16:09:25 INFO SparkContext: Added JAR
> /home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar at
> http://10.27.112.32:59460/jars/twitter4j-core-3.0.5.jar with timestamp
> 1391548165483
>
>
> However, when I try to import one of the classes in that jar file I get
> the following error.
>
>
> scala> import twitter4j.Status
>
> <console>:10: error: not found: value twitter4j
>
> import twitter4j.Status
>
> ^
>