Posted to user@spark.apache.org by Fengdong Yu <fe...@everstring.com> on 2015/10/12 10:48:11 UTC

how to use SharedSparkContext

Hi, 
How do I add a dependency in build.sbt if I want to use SharedSparkContext?

I've added spark-core, but it doesn't work (SharedSparkContext cannot be found).





Re: how to use SharedSparkContext

Posted by Fengdong Yu <fe...@everstring.com>.
Oh, yes. Thanks very much.



> On Oct 14, 2015, at 18:47, Akhil Das <ak...@sigmoidanalytics.com> wrote:
> 
> com.holdenkarau.spark.testing


Re: how to use SharedSparkContext

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
I did a quick search and found the following; I haven't tested it myself.

Add the following to your build.sbt:

libraryDependencies += "com.holdenkarau" % "spark-testing-base_2.10" % "1.5.0_1.4.0_1.4.1_0.1.2"



Create a class extending com.holdenkarau.spark.testing.SharedSparkContext, and you should be able to use it.
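
For example, here is a minimal sketch of a ScalaTest suite (the class name WordCountSpec and the test body are illustrative; it assumes ScalaTest is on your test classpath, which spark-testing-base should pull in transitively):

import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.FunSuite

// SharedSparkContext provides a SparkContext named `sc`, shared
// across all tests in the suite and shut down when the suite ends.
class WordCountSpec extends FunSuite with SharedSparkContext {
  test("reduceByKey counts words") {
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .collectAsMap()
    assert(counts("a") === 2)
    assert(counts("b") === 1)
  }
}

Running sbt test should then pick it up like any other ScalaTest suite.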

Thanks
Best Regards

On Mon, Oct 12, 2015 at 2:18 PM, Fengdong Yu <fe...@everstring.com>
wrote:

> Hi,
> How do I add a dependency in build.sbt if I want to use SharedSparkContext?
>
> I've added spark-core, but it doesn't work (SharedSparkContext cannot be found).