Posted to dev@spark.apache.org by James <al...@gmail.com> on 2015/01/20 13:35:07 UTC

not found: type LocalSparkContext

Hi all,

While trying to write a test for my Spark application, I hit the following error:

```
Error:(14, 43) not found: type LocalSparkContext
class HyperANFSuite extends FunSuite with LocalSparkContext {
```

I could not find "LocalSparkContext" in the spark-core source code, so I wonder how
to write a test like [this](
https://github.com/apache/spark/blob/master/graphx/src/test/scala/org/apache/spark/graphx/lib/ConnectedComponentsSuite.scala
).

Alcaid

Re: not found: type LocalSparkContext

Posted by Reynold Xin <rx...@databricks.com>.
You don't need LocalSparkContext; it is only used for Spark's own unit tests.

You can just create a SparkContext and use it in your unit tests, e.g.

val sc = new SparkContext("local", "my test app", new SparkConf)
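
For instance, a minimal self-contained suite along those lines might look like the
sketch below (the suite name, test name, and data are illustrative, not from this
thread); it creates a local SparkContext before the tests run and stops it afterwards:

```
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._ // pair-RDD implicits; needed on Spark 1.2.x
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Illustrative example suite: it owns its own local SparkContext instead of
// relying on Spark's internal LocalSparkContext test trait.
class MySparkAppSuite extends FunSuite with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    super.beforeAll()
    sc = new SparkContext("local", "my test app", new SparkConf)
  }

  override def afterAll(): Unit = {
    if (sc != null) sc.stop()
    super.afterAll()
  }

  test("word count on a small in-memory dataset") {
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .collectAsMap()
    assert(counts("a") === 2)
    assert(counts("b") === 1)
  }
}
```

The extra `import org.apache.spark.SparkContext._` is only needed on Spark 1.2.x (the
version used in this thread) to bring pair-RDD operations such as `reduceByKey` into
scope; from Spark 1.3 onwards it is unnecessary.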

Re: not found: type LocalSparkContext

Posted by James <al...@gmail.com>.
I could not correctly import org.apache.spark.LocalSparkContext.

I use sbt in IntelliJ for development; here is my build.sbt:

```
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.2.0"

libraryDependencies += "com.clearspring.analytics" % "stream" % "2.7.0"

libraryDependencies += "org.scalatest" % "scalatest_2.10" % "2.0"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
```

I think I may have made some mistakes in the library settings. As a new developer
of Spark applications, I wonder what the standard procedure for developing a Spark
application is.

Any reply is appreciated.


Alcaid


Re: not found: type LocalSparkContext

Posted by Will Benton <wi...@redhat.com>.
It's declared here:

  https://github.com/apache/spark/blob/master/core/src/test/scala/org/apache/spark/LocalSparkContext.scala

I assume you're already importing LocalSparkContext, but since the test classes aren't included in Spark packages, you'll also need to package them up in order to use them in your application (viz., outside of Spark).
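
For example, one way to "package them up" is simply to keep a small stand-in for the
trait in your own test sources. The sketch below is not Spark's actual
LocalSparkContext, just a simplified equivalent that gives each test a fresh local
SparkContext and cleans up afterwards:

```
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterEach, Suite}

// Simplified stand-in for Spark's LocalSparkContext test trait (illustrative,
// not copied from Spark): manages a local SparkContext around each test.
trait LocalSparkContext extends BeforeAndAfterEach { self: Suite =>

  @transient var sc: SparkContext = _

  override def afterEach(): Unit = {
    if (sc != null) {
      sc.stop()
      sc = null
    }
    // Let the next test start a fresh context on a new driver port.
    System.clearProperty("spark.driver.port")
    super.afterEach()
  }

  /** Run a test body with a freshly created local SparkContext. */
  def withSpark[T](body: SparkContext => T): T = {
    sc = new SparkContext("local", "test", new SparkConf)
    try body(sc) finally {
      sc.stop()
      sc = null
    }
  }
}
```

A suite such as the HyperANFSuite from the error message can then mix it in
(`class HyperANFSuite extends FunSuite with LocalSparkContext`) and wrap each test
body in `withSpark { sc => ... }`, roughly the way the GraphX suites linked above do.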



best,
wb

