Posted to issues@spark.apache.org by "Brenden Matthews (JIRA)" <ji...@apache.org> on 2014/10/30 21:34:37 UTC
[jira] [Created] (SPARK-4158) Spark throws exception when Mesos resources are missing
Brenden Matthews created SPARK-4158:
---------------------------------------
Summary: Spark throws exception when Mesos resources are missing
Key: SPARK-4158
URL: https://issues.apache.org/jira/browse/SPARK-4158
Project: Spark
Issue Type: Bug
Components: Mesos
Affects Versions: 1.1.0
Reporter: Brenden Matthews
Spark throws an exception when it tries to look up a resource that hasn't been offered by Mesos. This is a bug in Spark and should be fixed as such. Here's a sample:
{code}
Exception in thread "Thread-41" java.lang.IllegalArgumentException: No resource called cpus in [name: "mem"
type: SCALAR
scalar {
value: 2067.0
}
role: "*"
, name: "disk"
type: SCALAR
scalar {
value: 900.0
}
role: "*"
, name: "ports"
type: RANGES
ranges {
range {
begin: 31000
end: 32000
}
}
role: "*"
]
at org.apache.spark.scheduler.cluster.mesos.CoarseMesosSchedulerBackend.org$apache$spark$scheduler$cluster$mesos$CoarseMesosSchedulerBackend$$getResource(CoarseMesosSchedulerBackend.scala:236)
at org.apache.spark.scheduler.cluster.mesos.CoarseMesosSchedulerBackend$$anonfun$resourceOffers$1.apply(CoarseMesosSchedulerBackend.scala:200)
at org.apache.spark.scheduler.cluster.mesos.CoarseMesosSchedulerBackend$$anonfun$resourceOffers$1.apply(CoarseMesosSchedulerBackend.scala:197)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at org.apache.spark.scheduler.cluster.mesos.CoarseMesosSchedulerBackend.resourceOffers(CoarseMesosSchedulerBackend.scala:197)
{code}
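One defensive approach would be for the resource lookup to fall back to 0.0 when the named resource is absent from the offer, instead of throwing. The sketch below is a minimal, self-contained model of that idea: the Resource case class stands in for org.apache.mesos.Protos.Resource, and this is not the actual Spark patch, just an illustration of the behavior the reporter expects.

```scala
// Minimal stand-in for org.apache.mesos.Protos.Resource (assumption:
// the real object is a protobuf message with getName/getScalar accessors).
case class Resource(name: String, value: Double)

// Defensive lookup: sum the matching scalar resources, yielding 0.0
// when the name is absent rather than raising IllegalArgumentException.
def getResource(resources: Seq[Resource], name: String): Double =
  resources.filter(_.name == name).map(_.value).sum

// An offer like the one in the log above: mem and disk, but no cpus.
val offer = Seq(Resource("mem", 2067.0), Resource("disk", 900.0))
println(getResource(offer, "cpus")) // prints 0.0 instead of throwing
```

With this shape, the scheduler can simply skip offers whose cpus or mem come back as 0.0 rather than crashing the driver thread.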
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org