Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/05/06 09:35:00 UTC
[jira] [Resolved] (SPARK-7395) some suggestion about SimpleApp in quick-start.html
[ https://issues.apache.org/jira/browse/SPARK-7395?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-7395.
------------------------------
Resolution: Not A Problem
Fix Version/s: (was: 1.4.0)
Target Version/s: (was: 1.3.1)
Please review https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark first
Don't set target or fix version.
You set a master when you run the example with spark-submit, not in the code.
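A minimal sketch of what this means in practice: the master URL is supplied on the spark-submit command line rather than hard-coded with setMaster, so the documented SimpleApp code runs unmodified. The class name and JAR path below are placeholder examples, not taken from this issue.

```shell
# After packaging the application (e.g. with `sbt package`), submit it and
# pass the master URL here, on the command line -- not in the code.
# "SimpleApp" and the JAR path are hypothetical placeholders.
spark-submit \
  --class "SimpleApp" \
  --master "local[4]" \
  target/scala-2.10/simple-app_2.10-1.0.jar
```

Run this way, `new SparkConf().setAppName("Simple Application")` needs no setMaster call, and the same JAR can later be submitted to a cluster (e.g. `--master spark://host:7077`) without changing the code.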
> some suggestion about SimpleApp in quick-start.html
> ---------------------------------------------------
>
> Key: SPARK-7395
> URL: https://issues.apache.org/jira/browse/SPARK-7395
> Project: Spark
> Issue Type: Documentation
> Components: Documentation
> Affects Versions: 1.3.1
> Environment: none
> Reporter: zhengbing li
> Original Estimate: 12h
> Remaining Estimate: 12h
>
> Based on the SimpleApp code example in https://spark.apache.org/docs/latest/quick-start.html, I could not run the SimpleApp code until I modified "val conf = new SparkConf().setAppName("Simple Application")" to "val conf = new SparkConf().setAppName("Simple Application").setMaster("local")".
> So the documentation might be updated for the sake of beginners.
> The error of scala example is as follows:
> 15/05/06 15:05:48 INFO SparkContext: Running Spark version 1.3.0
> Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:206)
> at com.huawei.openspark.TestSpark$.main(TestSpark.scala:12)
> at com.huawei.openspark.TestSpark.main(TestSpark.scala)
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)