Posted to user@spark.apache.org by Ji ZHANG <zh...@gmail.com> on 2014/09/20 08:20:03 UTC
How to Exclude Spark Dependencies from spark-streaming-kafka?
Hi,
I'm developing an application with spark-streaming-kafka, which
depends on spark-streaming and kafka. Since spark-streaming is
provided at runtime, I want to exclude its jars from the assembly. I
tried the following configuration:
libraryDependencies ++= {
  val sparkVersion = "1.0.2"
  Seq(
    "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion,
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
  )
}
That didn't work. I also tried:
libraryDependencies ++= {
  val sparkVersion = "1.0.2"
  Seq(
    ("org.apache.spark" %% "spark-streaming-kafka" % sparkVersion).
      exclude("org.apache.spark", "spark-streaming"),
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
  )
}
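As an aside, my guess (just an assumption on my part) for why the
exclude above has no effect is that sbt's exclude matches the exact
artifact name, while cross-built Spark artifacts carry a Scala-version
suffix such as spark-streaming_2.10. An ExclusionRule keyed on the
organization alone would sidestep the suffix problem:

```scala
libraryDependencies ++= {
  val sparkVersion = "1.0.2"
  Seq(
    // excludeAll with an organization-only rule matches the artifact
    // regardless of its _2.10 cross-version suffix
    ("org.apache.spark" %% "spark-streaming-kafka" % sparkVersion)
      .excludeAll(ExclusionRule(organization = "org.apache.spark")),
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
  )
}
```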
It still packages all of spark-streaming's jars. Finally I came up with:
libraryDependencies ++= {
  val sparkVersion = "1.0.2"
  Seq(
    "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion intransitive(),
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
    ("org.apache.kafka" %% "kafka" % "0.8.0").
      exclude("com.sun.jmx", "jmxri").
      exclude("com.sun.jdmk", "jmxtools").
      exclude("net.sf.jopt-simple", "jopt-simple").
      exclude("org.slf4j", "slf4j-simple").
      exclude("org.apache.zookeeper", "zookeeper")
  )
}
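If I do have to keep the explicit kafka dependency, the chain of
exclude calls can at least be collapsed into a single excludeAll (same
exclusions, just less verbose), assuming sbt's ExclusionRule(org, name)
form:

```scala
libraryDependencies ++= {
  val sparkVersion = "1.0.2"
  Seq(
    "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion intransitive(),
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
    // one excludeAll call instead of five chained exclude calls
    ("org.apache.kafka" %% "kafka" % "0.8.0").excludeAll(
      ExclusionRule("com.sun.jmx", "jmxri"),
      ExclusionRule("com.sun.jdmk", "jmxtools"),
      ExclusionRule("net.sf.jopt-simple", "jopt-simple"),
      ExclusionRule("org.slf4j", "slf4j-simple"),
      ExclusionRule("org.apache.zookeeper", "zookeeper")
    )
  )
}
```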
Wordy, but it works. So I'm wondering whether there's a better way.
Thanks.
--
Jerry
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org