Posted to user@spark.apache.org by yaoxin <ya...@gmail.com> on 2014/02/21 05:09:13 UTC

Re: SparkContext startup time out

I set up a project in IDEA with libraryDependencies  "org.apache.spark" %
"spark-core_2.10" % "0.9.0-incubating".

This project contains only one object.

Should I run this on a Spark cluster, or did I miss some library?
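For reference, a minimal build.sbt pulling in that dependency might look like
the sketch below (the project name and exact Scala patch version are
assumptions, not taken from my actual setup):

    // build.sbt -- minimal sketch; name and Scala patch version are assumed
    name := "spark-test"

    version := "0.1"

    scalaVersion := "2.10.3"

    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating"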



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1754p1868.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: SparkContext startup time out

Posted by Mayur Rustagi <ma...@gmail.com>.
Yes, just sbt run.
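No cluster is needed for that: with a "local" master URL the SparkContext
runs everything in-process. A minimal sketch of such a single-object project
against Spark 0.9 (the object name and sample job are made up for
illustration):

    import org.apache.spark.SparkContext

    object Main {
      def main(args: Array[String]) {
        // "local[2]" runs Spark inside this JVM with 2 worker threads,
        // so no cluster and no extra deployment are required
        val sc = new SparkContext("local[2]", "Simple App")

        // tiny sanity-check job: count the even numbers in 1..100
        val evens = sc.parallelize(1 to 100).filter(_ % 2 == 0).count()
        println("even numbers: " + evens)

        sc.stop()
      }
    }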

Mayur Rustagi
Ph: +919632149971
http://www.sigmoidanalytics.com
https://twitter.com/mayur_rustagi



On Thu, Feb 20, 2014 at 8:09 PM, yaoxin <ya...@gmail.com> wrote:

> I set up a project in IDEA with libraryDependencies  "org.apache.spark" %
> "spark-core_2.10" % "0.9.0-incubating".
>
> This project contains only one object.
>
> Should I run this on a Spark cluster, or did I miss some library?