Posted to user@spark.apache.org by Patcharee Thongtra <Pa...@uni.no> on 2015/03/11 16:37:23 UTC
bad symbolic reference. A signature in SparkContext.class refers to term conf in value org.apache.hadoop which is not available
Hi,
I have built Spark version 1.3 and tried to use it in my Spark Scala
application. When I tried to compile and build the application with SBT, I
got this error:
bad symbolic reference. A signature in SparkContext.class refers to term
conf in value org.apache.hadoop which is not available
It seems the Hadoop library is missing, but shouldn't it be pulled in
automatically by SBT?
This application builds fine against Spark version 1.2.
Here is my build.sbt
name := "wind25t-v013"
version := "0.1"
scalaVersion := "2.10.4"
unmanagedBase := baseDirectory.value / "lib"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.3.0"
libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.3.0"
What should I do to fix it?
BR,
Patcharee
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org
Re: bad symbolic reference. A signature in SparkContext.class refers to term conf in value org.apache.hadoop which is not available
Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Spark 1.3.0 is not officially out yet, so I don't think SBT will download
the Hadoop dependencies for your Spark build by itself. You could try
adding the Hadoop dependencies manually (hadoop-core, hadoop-common,
hadoop-client).
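For reference, a minimal sketch of what that might look like in build.sbt;
the Hadoop 2.4.0 version here is an assumption, so match it to whatever
Hadoop version your Spark 1.3 build was compiled against:

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0" // assumed Hadoop version
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.4.0" // assumed Hadoop version

Also, since 1.3.0 isn't on Maven Central yet, SBT can only resolve a
locally built Spark if you publish it to your local repository first
(e.g. mvn -DskipTests clean install from the Spark source tree) and then
point SBT at the local Maven repository:

resolvers += Resolver.mavenLocal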
Thanks
Best Regards