Posted to issues@spark.apache.org by "Takeshi Yamamuro (JIRA)" <ji...@apache.org> on 2014/12/26 02:21:13 UTC

[jira] [Created] (SPARK-4970) Fix an implicit bug in SparkSubmitSuite

Takeshi Yamamuro created SPARK-4970:
---------------------------------------

             Summary: Fix an implicit bug in SparkSubmitSuite
                 Key: SPARK-4970
                 URL: https://issues.apache.org/jira/browse/SPARK-4970
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
            Reporter: Takeshi Yamamuro
            Priority: Minor


The test "includes jars passed in through --jars" in SparkSubmitSuite fails
when spark.executor.memory is set to more than 512 MiB in conf/spark-defaults.conf.

An exception is thrown as follows:

Exception in thread "main" org.apache.spark.SparkException: Asked to launch cluster with 512 MB RAM / worker but requested 1024 MB/worker
	at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:1889)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:322)
	at org.apache.spark.deploy.JarCreationTest$.main(SparkSubmitSuite.scala:458)
	at org.apache.spark.deploy.JarCreationTest.main(SparkSubmitSuite.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:367)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
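The conflict above can be sketched as follows: the test starts a local-cluster master whose workers get 512 MB each, but the SparkConf inherits the larger spark.executor.memory from conf/spark-defaults.conf. As a hedged illustration (not the committed fix; the paths, jar names, and the 1g value are hypothetical), pinning the executor memory explicitly on the command line keeps the request within the worker's budget:

```shell
# conf/spark-defaults.conf contains, e.g.:
#   spark.executor.memory  1g
#
# local-cluster[2,1,512] launches 2 workers with 1 core and 512 MB each,
# so the inherited 1g executor request triggers the SparkException above.
# An explicit --conf override takes precedence over spark-defaults.conf:
spark-submit \
  --master "local-cluster[2,1,512]" \
  --conf spark.executor.memory=512m \
  --jars /path/to/extra.jar \
  --class org.apache.spark.deploy.JarCreationTest \
  /path/to/test-app.jar
```

This relies on the documented precedence order in which settings passed via --conf override those read from spark-defaults.conf.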



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org