Posted to issues@spark.apache.org by "Thomas Graves (Jira)" <ji...@apache.org> on 2020/09/08 15:23:00 UTC

[jira] [Created] (SPARK-32824) The error is confusing when resource .amount is not provided

Thomas Graves created SPARK-32824:
-------------------------------------

             Summary: The error is confusing when resource .amount is not provided
                 Key: SPARK-32824
                 URL: https://issues.apache.org/jira/browse/SPARK-32824
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.0.0
            Reporter: Thomas Graves


If the user forgets to specify the .amount suffix when configuring a resource, the resulting error is confusing; we should improve it. For example:

 

$ $SPARK_HOME/bin/spark-shell  --master spark://host9:7077 --conf spark.executor.resource.gpu=1
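
For reference, the form Spark expects carries the .amount suffix (assuming the intent here was to request one GPU per executor):

$ $SPARK_HOME/bin/spark-shell  --master spark://host9:7077 --conf spark.executor.resource.gpu.amount=1

With the suffix missing, the shell instead fails with: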

 
{code:java}
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/09/08 08:19:35 ERROR SparkContext: Error initializing SparkContext.
java.lang.StringIndexOutOfBoundsException: String index out of range: -1
	at java.lang.String.substring(String.java:1967)
	at org.apache.spark.resource.ResourceUtils$.$anonfun$listResourceIds$1(ResourceUtils.scala:151)
	at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
	at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
	at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
	at scala.collection.TraversableLike.map(TraversableLike.scala:238)
	at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
	at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
	at org.apache.spark.resource.ResourceUtils$.listResourceIds(ResourceUtils.scala:150)
	at org.apache.spark.resource.ResourceUtils$.parseAllResourceRequests(ResourceUtils.scala:158)
	at org.apache.spark.SparkContext$.checkResourcesPerTask$1(SparkContext.scala:2773)
	at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2884)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:528)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
	at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
	at $line3.$read$$iw$$iw.<init>(<console>:15)
	at $line3.$read$$iw.<init>(<console>:42)
	at $line3.$read.<init>(<console>:44)
{code}
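
Judging from the trace, the failure comes from ResourceUtils.listResourceIds (ResourceUtils.scala:151), where the resource name is apparently extracted as a substring up to the first '.' of the config key after the component prefix is stripped; with no .amount suffix there is no '.', indexOf returns -1, and substring(0, -1) throws. Below is a minimal, self-contained sketch of that parsing pattern and of the kind of clearer validation this ticket asks for (the object and method names are illustrative only, not Spark's actual code):

{code:scala}
// Hypothetical sketch, not Spark's real ResourceUtils: shows why a key without
// ".amount" can surface as StringIndexOutOfBoundsException, and a clearer check.
object ResourceNameParsingSketch {

  // Mirrors the substring(0, indexOf('.')) pattern on a key with the component
  // prefix already stripped, e.g. "gpu.amount" from "spark.executor.resource.gpu.amount".
  def parseResourceName(strippedKey: String): String =
    strippedKey.substring(0, strippedKey.indexOf('.')) // indexOf == -1 when no '.' present

  // Friendlier variant: report the malformed key instead of crashing obscurely.
  def parseResourceNameChecked(strippedKey: String): String = {
    val dot = strippedKey.indexOf('.')
    require(dot > 0,
      s"Malformed resource config key '$strippedKey': expected '<resourceName>.amount', e.g. 'gpu.amount'")
    strippedKey.substring(0, dot)
  }

  def main(args: Array[String]): Unit = {
    println(parseResourceNameChecked("gpu.amount")) // prints "gpu"

    try {
      parseResourceName("gpu") // no '.' -> substring(0, -1)
    } catch {
      case e: StringIndexOutOfBoundsException =>
        println(s"confusing error today: $e")
    }

    try {
      parseResourceNameChecked("gpu")
    } catch {
      case e: IllegalArgumentException =>
        println(s"clearer error: ${e.getMessage}")
    }
  }
}
{code}

A check along those lines would tell the user which key is malformed and what form is expected, instead of surfacing a bare StringIndexOutOfBoundsException.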


