Posted to user@spark.apache.org by Shing Hing Man <ma...@yahoo.com.INVALID> on 2014/09/01 23:08:54 UTC

Spark 1.0.2 Can GroupByTest example be run in Eclipse without change

Hi, 

I have noticed that the GroupByTest example in
https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/GroupByTest.scala
has been changed to be run using spark-submit.
Previously, I set "local" as the first command-line argument, which enabled me to run GroupByTest in Eclipse:
val sc = new SparkContext(args(0), "GroupBy Test",
  System.getenv("SPARK_HOME"), SparkContext.jarOfClass(this.getClass).toSeq)


In the latest GroupByTest code, I cannot pass "local" in as the first command-line argument:
val sparkConf = new SparkConf().setAppName("GroupBy Test")
var numMappers = if (args.length > 0) args(0).toInt else 2
var numKVPairs = if (args.length > 1) args(1).toInt else 1000
var valSize = if (args.length > 2) args(2).toInt else 1000
var numReducers = if (args.length > 3) args(3).toInt else numMappers
val sc = new SparkContext(sparkConf)
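(With the new code, the master is meant to be supplied by spark-submit rather than by args(0); the four positional arguments now map to numMappers, numKVPairs, valSize and numReducers. An illustrative invocation follows; the jar path and version are assumptions that depend on how the examples were built.)

```shell
# Illustrative only: the examples jar name/path depends on your build.
spark-submit \
  --class org.apache.spark.examples.GroupByTest \
  --master local \
  examples/target/scala-2.10/spark-examples-assembly.jar \
  2 1000 1000 2
```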


Is there a way to specify "master=local" (maybe in an environment variable), so that I can run the latest
version of GroupByTest in Eclipse without changing the code?

Thanks in advance for your assistance!

Shing 

Re: Spark 1.0.2 Can GroupByTest example be run in Eclipse without change

Posted by Shing Hing Man <ma...@yahoo.com.INVALID>.
After looking at the source code of SparkConf.scala, I found the following solution.
Just set the following Java system property:

-Dspark.master=local
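
(This works because SparkConf reads every JVM system property prefixed with "spark." when it is constructed, so setting spark.master before the SparkContext is created has the same effect as passing -Dspark.master=local as a VM argument in the Eclipse run configuration. A minimal sketch of the idea in plain Scala, with no Spark dependency; the object name is made up for illustration:)

```scala
// Sketch: SparkConf picks up JVM properties prefixed with "spark." at
// construction time, so -Dspark.master=local (or the programmatic
// equivalent below, run before the SparkContext is created) supplies
// the master without touching the example's code.
object MasterFromSysProp {
  def main(args: Array[String]): Unit = {
    // Same effect as the -Dspark.master=local VM argument in the
    // Eclipse run configuration.
    System.setProperty("spark.master", "local")

    val master = sys.props.getOrElse("spark.master", "<not set>")
    println(master)
  }
}
```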

Shing


