Posted to issues@spark.apache.org by "Ajay Viswanathan (JIRA)" <ji...@apache.org> on 2014/06/04 21:05:01 UTC

[jira] [Created] (SPARK-2020) Spark 1.0.0 fails to run in coarse-grained mesos mode

Ajay Viswanathan created SPARK-2020:
---------------------------------------

             Summary: Spark 1.0.0 fails to run in coarse-grained mesos mode
                 Key: SPARK-2020
                 URL: https://issues.apache.org/jira/browse/SPARK-2020
             Project: Spark
          Issue Type: Bug
          Components: Mesos
    Affects Versions: 1.0.0
         Environment: Ubuntu 14.04, 64-bit
8GB RAM
            Reporter: Ajay Viswanathan


I am using Mesos to run Spark applications on a cluster.
Earlier, in Spark 0.9.1 and below, I could run tasks in coarse-grained mode on the workers; but when I try to do the same in Spark 1.0.0, I get an exception that prevents the tasks from running. Fine-grained mode still works fine in Spark 1.0.0.
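
For reference, a minimal sketch of how such a job is launched in coarse-grained mode (not taken from this report; the master address and executor URI below are placeholders):

Snippet of launcher code -
// Minimal sketch, assuming the usual Mesos coarse-grained setup for Spark 1.0.0.
import org.apache.spark.{SparkConf, SparkContext}

object CoarseGrainedRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("mesos://<mesos-master-ip>:5050")                // placeholder Mesos master URL
      .setAppName("coarse-grained-repro")
      .set("spark.mesos.coarse", "true")                          // request coarse-grained mode
      .set("spark.executor.uri", "hdfs://<path>/spark-1.0.0.tgz") // placeholder Spark distribution
    val sc = new SparkContext(conf)
    // Any simple action forces executors to launch, which is where the failure shows up.
    println(sc.parallelize(1 to 1000).count())
    sc.stop()
  }
}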

Snippet of stderr - 
Executor registered on slave
Exception in thread "main" java.lang.NumberFormatException: For input string: "<ip>"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:492)
        at java.lang.Integer.parseInt(Integer.java:527)
        at scala.collection.immutable.StringLike$class.toInt(StringLike.scala:229)
        at scala.collection.immutable.StringOps.toInt(StringOps.scala:31)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:135)
        at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
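
The trace points at the executor's launch arguments: one positional argument is parsed as an integer core count, and here the slave's IP string ends up in that slot. A rough illustration of how that produces this exact exception (not the actual Spark source; the argument order shown is an assumption) -

// Rough illustration only (not Spark source). Assumes the backend reads its
// positional arguments as: <driverUrl> <executorId> <hostname> <cores> ...
object ArgOrderSketch {
  def main(args: Array[String]): Unit = {
    // A mis-built launch command can leave the slave's IP in the <cores> slot:
    val argv = Array("akka.tcp://spark@driver/user/CoarseGrainedScheduler",
                     "executor-id", "hostname", "<ip>")
    try {
      val cores = argv(3).toInt
      println(s"cores = $cores")
    } catch {
      case e: NumberFormatException =>
        // Same failure mode as the executor: For input string: "<ip>"
        println(e)
    }
  }
}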

Running the Spark application connected to the Mesos master throws an error - Is Spark installed on it?
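
Until this is fixed, one workaround is to keep using fine-grained mode (the default), which as noted above still works on 1.0.0 - relative to the launcher sketch earlier, that is just:

// Workaround sketch: leave coarse-grained mode disabled (false is the default).
conf.set("spark.mesos.coarse", "false")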



--
This message was sent by Atlassian JIRA
(v6.2#6252)