Posted to user@spark.apache.org by rayqiu <ra...@gmail.com> on 2016/01/10 01:12:51 UTC
java.lang.NoClassDefFoundError even when using sc.addJar
Code:

val sc = new SparkContext(sparkConf)
sc.addJar("/opt/spark-1.6.0-bin-hadoop2.6/lib/spark-streaming-kafka-assembly_2.10-1.6.0.jar")

Submitted with:

> spark-submit --class "GeoIP" target/scala-2.10/geoip-assembly-1.0.jar
The log shows the jar was added:

16/01/09 16:05:20 INFO SparkContext: Added JAR /opt/spark-1.6.0-bin-hadoop2.6/lib/spark-streaming-kafka-assembly_2.10-1.6.0.jar at http://192.168.8.107:59070/jars/spark-streaming-kafka-assembly_2.10-1.6.0.jar with timestamp 1452384320186
But it still failed later with:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$
Passing the jar with --jars on the command line instead ran fine without any problem:

> spark-submit --class "GeoIP" --jars spark-streaming-kafka-assembly_2.10-1.6.0.jar target/scala-2.10/geoip-assembly-1.0.jar
I understand that addJar does not help in the spark-shell, but that is not the case here. Can someone please explain this? Thanks!
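In case it clarifies the question, here is the configuration-based variant I would try next, a sketch only: it assumes SparkConf.setJars distributes the jar to executors the way --jars does, and I have not verified whether it avoids the driver-side NoClassDefFoundError, since the driver JVM may need the class on its classpath before the SparkContext exists at all.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: list the Kafka assembly in spark.jars *before* the context
// is created, instead of calling sc.addJar afterwards. This is the
// programmatic analogue of passing --jars to spark-submit.
val sparkConf = new SparkConf()
  .setAppName("GeoIP")
  .setJars(Seq("/opt/spark-1.6.0-bin-hadoop2.6/lib/spark-streaming-kafka-assembly_2.10-1.6.0.jar"))

val sc = new SparkContext(sparkConf)
```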