Posted to user@spark.apache.org by Robert James <sr...@gmail.com> on 2014/06/25 17:26:16 UTC
Spark's Hadoop Dependency
To add Spark to an SBT project, I do:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"
How do I make sure that the Spark version that gets downloaded
depends on, and uses, Hadoop 2 rather than Hadoop 1?
Even with a line:
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"
I still see SBT downloading Hadoop 1:
[debug] == resolving dependencies
org.apache.spark#spark-core_2.10;1.0.0->org.apache.hadoop#hadoop-client;1.0.4 [compile->master(*)]
[debug] dependency descriptor has been mediated: dependency: org.apache.hadoop#hadoop-client;2.4.0 {compile=[default(compile)]} => dependency: org.apache.hadoop#hadoop-client;1.0.4 {compile=[default(compile)]}
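[Editor's note: a declared direct dependency does not always win conflict mediation against the version pinned in spark-core's POM. One alternative not mentioned in the thread, assuming sbt 0.13 or later where dependencyOverrides is available, is to force the mediated version without adding a direct dependency:]

```scala
// build.sbt -- a sketch, assuming sbt 0.13+.
// dependencyOverrides forces the version picked during conflict
// mediation, without adding hadoop-client as a direct dependency.
dependencyOverrides += "org.apache.hadoop" % "hadoop-client" % "2.4.0"
```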
Re: Spark's Hadoop Dependency
Posted by Koert Kuipers <ko...@tresata.com>.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % versionSpark % "provided"
    exclude("org.apache.hadoop", "hadoop-client"),
  "org.apache.hadoop" % "hadoop-client" % versionHadoop % "provided"
)
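[Editor's note: the reply above, assembled into a complete build.sbt sketch. The version values are illustrative placeholders for the reply's versionSpark and versionHadoop, and "provided" assumes the Spark and Hadoop jars are supplied by the cluster at runtime:]

```scala
// build.sbt -- sketch based on the reply above; versions are illustrative.
val versionSpark  = "1.0.0"
val versionHadoop = "2.4.0"

libraryDependencies ++= Seq(
  // Exclude the transitive hadoop-client (1.0.4) pulled in by spark-core's POM...
  "org.apache.spark" %% "spark-core" % versionSpark % "provided"
    exclude("org.apache.hadoop", "hadoop-client"),
  // ...and declare the Hadoop 2 client explicitly instead.
  "org.apache.hadoop" % "hadoop-client" % versionHadoop % "provided"
)
```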
On Wed, Jun 25, 2014 at 11:26 AM, Robert James <sr...@gmail.com>
wrote: