Posted to user@spark.apache.org by yh18190 <yh...@gmail.com> on 2014/04/03 12:36:23 UTC

How to use addJar for adding external jars in spark-0.9?

Hi,

I think there is a problem with Spark 0.9: when I try to add the external jar
jerkson_2.9.1_0.5.0 (with Scala version 2.10.3) on the cluster, I get a
java.lang.NoClassDefFoundError because these jars are not being sent to the
worker nodes.
Please let me know how to resolve this issue.

val sc = new SparkContext(
  "spark://spark-master-001:7077",
  "Simple App",
  "/opt/spark",
  List("/home/ubuntu/spark-jobs/work_meghana/twitterdatasets1/target/scala-2.10/simple-project_2.10-1.0.jar"))

sc.addJar("path of jar on master machine")
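For reference, a sketch of how the jars constructor argument and addJar are typically combined in the 0.9-era API — the master URL and the first jar path are the ones from the snippet above, while the jerkson paths are hypothetical placeholders. Note also that a jar built for Scala 2.9.1 is binary-incompatible with Scala 2.10.x, so a _2.10 build of the dependency would be needed regardless of how it is shipped:

```scala
import org.apache.spark.SparkContext

// Sketch only: assumes the cluster, SPARK_HOME, and jar paths below exist.
// Jars listed here are shipped by Spark to the worker nodes at startup.
val sc = new SparkContext(
  "spark://spark-master-001:7077", // master URL
  "Simple App",                    // application name
  "/opt/spark",                    // Spark home on the cluster
  List(
    "/home/ubuntu/spark-jobs/work_meghana/twitterdatasets1/target/scala-2.10/simple-project_2.10-1.0.jar",
    "/path/to/jerkson_2.10-0.5.0.jar" // hypothetical: ship the dependency too
  ))

// Alternatively, add a jar after the context is created; the path must be
// valid on the driver (master) machine:
sc.addJar("/path/on/driver/to/jerkson_2.10-0.5.0.jar") // hypothetical path
```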





--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-use-addJar-for-adding-external-jars-in-spark-0-9-tp3701.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: How to use addJar for adding external jars in spark-0.9?

Posted by andy petrella <an...@gmail.com>.
Could you try building an "uber jar" with all the dependencies, using the Maven
Shade plugin or sbt assembly (or whatever other tool can do that ^^)?
I think that will be easier as a first step than trying to add the dependencies
independently (and it reduces the amount of network traffic).
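For the sbt route, a minimal sketch of the assembly setup — the plugin version, project name, and dependency lines are assumptions matching what was current around Spark 0.9, not a definitive build file:

```scala
// project/plugins.sbt -- pull in the sbt-assembly plugin (version is an assumption)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

// build.sbt -- older sbt-assembly versions required importing its settings
import AssemblyKeys._

assemblySettings

name := "simple-project"

scalaVersion := "2.10.3"

// Mark Spark itself as "provided" so it is not bundled into the fat jar;
// the cluster already has it on the classpath.
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating" % "provided"
```

Running `sbt assembly` then produces a single fat jar under target/scala-2.10/ that can be passed as the only entry in the SparkContext jars list, so every dependency reaches the workers together.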

Andy Petrella
Belgium (Liège)

*       *********
 Data Engineer in *NextLab <http://nextlab.be/> sprl* (owner)
 Engaged Citizen Coder for *WAJUG <http://wajug.be/>* (co-founder)
 Author of *Learning Play! Framework 2
<http://www.packtpub.com/learning-play-framework-2/book>*
 Bio: on Vizify <https://www.vizify.com/es/52c3feec2163aa0010001eaa>
*       *********
Mobile: *+32 495 99 11 04*
Mails:

   - andy.petrella@nextlab.be
   - andy.petrella@gmail.com

*       *********
Socials:

   - Twitter: https://twitter.com/#!/noootsab
   - LinkedIn: http://be.linkedin.com/in/andypetrella
   - Blogger: http://ska-la.blogspot.com/
   - GitHub:  https://github.com/andypetrella
   - Masterbranch: https://masterbranch.com/andy.petrella



On Thu, Apr 3, 2014 at 12:36 PM, yh18190 <yh...@gmail.com> wrote:

> [...]