Posted to issues@spark.apache.org by "liucheng (Jira)" <ji...@apache.org> on 2020/06/18 12:02:00 UTC

[jira] [Created] (SPARK-32022) Can many executors share one GPU in Spark 3.0?

liucheng created SPARK-32022:
--------------------------------

             Summary: Can many executors share one GPU in Spark 3.0?
                 Key: SPARK-32022
                 URL: https://issues.apache.org/jira/browse/SPARK-32022
             Project: Spark
          Issue Type: Question
          Components: Spark Core
    Affects Versions: 3.0.0
         Environment: Spark 3.0 + Hadoop 3 + YARN cluster
            Reporter: liucheng


Hi, I want to run many executors (for example, 2) on a server with only one GPU card. I tested Spark 3.0 in YARN cluster mode with the following config:

spark-shell --conf spark.executor.resource.gpu.amount=0.5

Then I see the following error:

20/06/18 16:24:46 [main] ERROR SparkContext: Error initializing SparkContext.
java.lang.NumberFormatException: For input string: "0.5"
 at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
 at java.lang.Integer.parseInt(Integer.java:580)
 at java.lang.Integer.parseInt(Integer.java:615)
 at scala.collection.immutable.StringLike.toInt(StringLike.scala:304)
 at scala.collection.immutable.StringLike.toInt$(StringLike.scala:304)
 at scala.collection.immutable.StringOps.toInt(StringOps.scala:33)
 at org.apache.spark.resource.ResourceUtils$.parseResourceRequest(ResourceUtils.scala:142)
 at org.apache.spark.resource.ResourceUtils$.$anonfun$parseAllResourceRequests$1(ResourceUtils.scala:159)
 at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
 at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:75)
 at scala.collection.TraversableLike.map(TraversableLike.scala:238)
 at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
 at scala.collection.AbstractTraversable.map(Traversable.scala:108)
 at org.apache.spark.resource.ResourceUtils$.parseAllResourceRequests(ResourceUtils.scala:159)
 at org.apache.spark.SparkContext$.checkResourcesPerTask$1(SparkContext.scala:2773)
 at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2921)
 at org.apache.spark.SparkContext.<init>(SparkContext.scala:528)
 at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
 at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:931)
 at scala.Option.getOrElse(Option.scala:189)

 

...
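
From the stack trace it looks like ResourceUtils.parseResourceRequest parses the amount with Scala's StringLike.toInt, which only accepts whole numbers. A quick check in the Scala REPL reproduces the same exception (this is my own reproduction, not taken from the Spark sources):

scala> "0.5".toInt   // resource amounts appear to be parsed as integers
java.lang.NumberFormatException: For input string: "0.5"
  at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
  at java.lang.Integer.parseInt(Integer.java:580)
  ...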

 

My question: for Spark 3.0, can one GPU card support many executors?
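
For what it's worth: if executor-level amounts have to be whole numbers, my guess is that sharing is expressed at the task level instead, along the lines of the command below. The spark.task.resource.gpu.amount value and the discovery-script path are assumptions on my part (untested), and as far as I understand this would share one GPU between tasks inside a single executor, not between two executors:

spark-shell --conf spark.executor.resource.gpu.amount=1 \
            --conf spark.executor.resource.gpu.discoveryScript=/path/to/getGpusResources.sh \
            --conf spark.task.resource.gpu.amount=0.5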

Thank you!