Posted to user@spark.apache.org by "Shihaoliang (Shihaoliang)" <sh...@huawei.com> on 2014/04/28 11:32:31 UTC

Spark 1.0 run job fail

Hi all,

I got the Spark 1.0 snapshot code from Git and compiled it with this command:
mvn -Pbigtop-dist -Dhadoop.version=2.3.0 -Dyarn.version=2.3.0 -DskipTests package -e
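(For comparison, and as a guess at the cause: in Spark 1.0 the org.apache.spark.deploy.yarn classes are compiled into the assembly only when the yarn Maven profile is enabled, so a build intended for YARN would add -Pyarn, e.g.:)

```shell
# Hypothetical build command with YARN support enabled (-Pyarn);
# otherwise identical to the command above.
mvn -Pyarn -Pbigtop-dist \
    -Dhadoop.version=2.3.0 -Dyarn.version=2.3.0 \
    -DskipTests package -e
```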

On the cluster, I added [export SPARK_YARN_MODE=true] to spark-env.sh and ran the HdfsTest example;

and I got the error below. Has anyone hit a similar issue?

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/nosec/spark/lib/spark-examples_2.10-assembly-0.9.0-incubating.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/nosec/spark/assembly/target/scala-2.10/spark-assembly_2.10-1.0.0-SNAPSHOT-hadoop2.3.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]
1 [main] INFO org.apache.spark.SecurityManager - SecurityManager, is authentication enabled: false are ui acls enabled: false users with view permissions: Set(hadoop)
587 [spark-akka.actor.default-dispatcher-2] INFO akka.event.slf4j.Slf4jLogger - Slf4jLogger started
674 [spark-akka.actor.default-dispatcher-2] INFO Remoting - Starting remoting
888 [spark-akka.actor.default-dispatcher-5] INFO Remoting - Remoting started; listening on addresses :[akka.tcp://spark@VM-6.com:45972]
888 [spark-akka.actor.default-dispatcher-5] INFO Remoting - Remoting now listens on addresses: [akka.tcp://spark@VM-6.com:45972]
900 [main] INFO org.apache.spark.SparkEnv - Registering MapOutputTracker
903 [main] INFO org.apache.spark.SparkEnv - Registering BlockManagerMaster
932 [main] INFO org.apache.spark.storage.DiskBlockManager - Created local directory at /opt/nosec/spark/local/spark-local-20140429013124-3e55
936 [main] INFO org.apache.spark.storage.MemoryStore - MemoryStore started with capacity 297.0 MB.
966 [main] INFO org.apache.spark.network.ConnectionManager - Bound socket to port 59708 with id = ConnectionManagerId(VM-6.com,59708)
971 [main] INFO org.apache.spark.storage.BlockManagerMaster - Trying to register BlockManager
973 [spark-akka.actor.default-dispatcher-5] INFO org.apache.spark.storage.BlockManagerInfo - Registering block manager VM-6.com:59708 with 297.0 MB RAM
974 [main] INFO org.apache.spark.storage.BlockManagerMaster - Registered BlockManager
989 [main] INFO org.apache.spark.HttpServer - Starting HTTP Server
1888 [main] INFO org.eclipse.jetty.server.Server - jetty-7.x.y-SNAPSHOT
1909 [main] INFO org.eclipse.jetty.server.AbstractConnector - Started SocketConnector@0.0.0.0:58102
1910 [main] INFO org.apache.spark.broadcast.HttpBroadcast - Broadcast server started at http://9.91.11.28:58102
1918 [main] INFO org.apache.spark.HttpFileServer - HTTP File server directory is /tmp/spark-375beb29-df90-4b87-ab07-2a6855daf342
1918 [main] INFO org.apache.spark.HttpServer - Starting HTTP Server
1918 [main] INFO org.eclipse.jetty.server.Server - jetty-7.x.y-SNAPSHOT
1920 [main] INFO org.eclipse.jetty.server.AbstractConnector - Started SocketConnector@0.0.0.0:52144
2298 [main] INFO org.eclipse.jetty.server.Server - jetty-7.x.y-SNAPSHOT
2299 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/stages,null}
2301 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/stages/json,null}
2301 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/stages/stage,null}
2301 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/stages/stage/json,null}
2301 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/stages/pool,null}
2301 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/stages/pool/json,null}
2302 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/storage,null}
2302 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/storage/json,null}
2302 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/storage/rdd,null}
2302 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
2302 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/environment,null}
2302 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/environment/json,null}
2302 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/executors,null}
2302 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/executors/json,null}
2303 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/static,null}
2303 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/,null}
2303 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
2303 [main] INFO org.eclipse.jetty.server.handler.ContextHandler - started o.e.j.s.ServletContextHandler{/metrics/json,null}
2313 [main] INFO org.eclipse.jetty.server.AbstractConnector - Started SelectChannelConnector@0.0.0.0:4040
2314 [main] INFO org.apache.spark.ui.SparkUI - Started SparkUI at http://VM-6.com:4040
Exception in thread "main" java.lang.ExceptionInInitializerError
        at org.apache.spark.SparkContext.addJar(SparkContext.scala:894)
        at org.apache.spark.SparkContext$$anonfun$5.apply(SparkContext.scala:234)
        at org.apache.spark.SparkContext$$anonfun$5.apply(SparkContext.scala:234)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:234)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:110)
        at org.apache.spark.examples.HdfsTest$.main(HdfsTest.scala:24)
        at org.apache.spark.examples.HdfsTest.main(HdfsTest.scala)
Caused by: org.apache.spark.SparkException: Unable to load YARN support
        at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:91)
        at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:86)
        at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
        ... 9 more
Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:186)
        at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:87)
        ... 11 more


Thanks.
Peter Shi