Posted to dev@mahout.apache.org by "Junping Du (JIRA)" <ji...@apache.org> on 2015/12/03 17:14:11 UTC
[jira] [Commented] (MAHOUT-1770) Can mahout run on spark with yarn-cluster mode?
[ https://issues.apache.org/jira/browse/MAHOUT-1770?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15038008#comment-15038008 ]
Junping Du commented on MAHOUT-1770:
------------------------------------
I think you are hitting SPARK-7504. Which version of Spark are you using? Upgrading to 1.4 or later should work.
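For anyone checking whether their installation is affected, here is a minimal sketch. The `version_ok` helper is hypothetical (not part of Spark or Mahout) and assumes GNU `sort -V` for version comparison; it tests an installed Spark version against 1.4.0, the first release carrying the SPARK-7504 fix:

```shell
# Hypothetical helper: returns 0 (true) if the given version is >= 1.4.0,
# i.e. new enough to contain the SPARK-7504 fix for yarn-cluster mode.
version_ok() {
  # sort -V sorts version strings; if 1.4.0 sorts first, $1 is >= 1.4.0
  [ "$(printf '%s\n' "1.4.0" "$1" | sort -V | head -n1)" = "1.4.0" ]
}

# Example check against the version reported in this issue (1.3.0):
if version_ok "1.3.0"; then
  echo "Spark version is new enough for yarn-cluster mode"
else
  echo "Affected by SPARK-7504: upgrade to Spark 1.4.0 or later"
fi
```

On a live cluster, the version string to feed the helper can be read from `spark-submit --version`.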
> Can mahout run on spark with yarn-cluster mode?
> -----------------------------------------------
>
> Key: MAHOUT-1770
> URL: https://issues.apache.org/jira/browse/MAHOUT-1770
> Project: Mahout
> Issue Type: Question
> Affects Versions: 0.11.0
> Reporter: dodolzg
>
> I'm using mahout 0.11.0 on spark 1.3.0.
> It works fine in spark standalone, local, and yarn-client modes, but fails in yarn-cluster mode.
> My shell command is
> {quote}
> mahout spark-itemsimilarity \
> -i cf-data \
> -o item-sim-out \
> -ma yarn-cluster \
> --filter1 purchase \
> --filter2 view \
> -ic 2 \
> -rc 0 \
> -fc 1
> {quote}
> The error log:
> {quote}
> MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/root/apache-mahout-distribution-0.11.0/mahout-examples-0.11.0-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/root/apache-mahout-distribution-0.11.0/mahout-mr-0.11.0-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p0.4/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/root/mahout-distribution-0.9/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/root/apache-mahout-distribution-0.11.0/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See SLF4J Error Codes for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 15/08/27 10:20:37 INFO SparkContext: Running Spark version 1.3.0
> 15/08/27 10:20:37 INFO SecurityManager: Changing view acls to: root
> 15/08/27 10:20:37 INFO SecurityManager: Changing modify acls to: root
> 15/08/27 10:20:37 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
> 15/08/27 10:20:38 INFO Slf4jLogger: Slf4jLogger started
> 15/08/27 10:20:38 INFO Remoting: Starting remoting
> 15/08/27 10:20:38 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@TestCent7:45133]
> 15/08/27 10:20:38 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriver@TestCent7:45133]
> 15/08/27 10:20:38 INFO Utils: Successfully started service 'sparkDriver' on port 45133.
> 15/08/27 10:20:38 INFO SparkEnv: Registering MapOutputTracker
> 15/08/27 10:20:38 INFO SparkEnv: Registering BlockManagerMaster
> 15/08/27 10:20:38 INFO DiskBlockManager: Created local directory at /tmp/spark-1172a284-0c8c-40ee-9cdf-c5231d8c0974/blockmgr-a142fd9e-3c4c-46d8-ad8f-0071fb31f04b
> 15/08/27 10:20:38 INFO MemoryStore: MemoryStore started with capacity 1966.1 MB
> 15/08/27 10:20:38 INFO HttpFileServer: HTTP File server directory is /tmp/spark-c8e3861e-3822-4932-a750-01bb0fc648b9/httpd-504b3f69-834d-4618-939d-b4b089d61378
> 15/08/27 10:20:38 INFO HttpServer: Starting HTTP Server
> 15/08/27 10:20:38 INFO Server: jetty-8.y.z-SNAPSHOT
> 15/08/27 10:20:38 INFO AbstractConnector: Started SocketConnector@0.0.0.0:37966
> 15/08/27 10:20:38 INFO Utils: Successfully started service 'HTTP file server' on port 37966.
> 15/08/27 10:20:38 INFO SparkEnv: Registering OutputCommitCoordinator
> 15/08/27 10:20:38 INFO Server: jetty-8.y.z-SNAPSHOT
> 15/08/27 10:20:38 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
> 15/08/27 10:20:38 INFO Utils: Successfully started service 'SparkUI' on port 4040.
> 15/08/27 10:20:38 INFO SparkUI: Started SparkUI at http://testcent7:4040
> 15/08/27 10:20:38 INFO SparkContext: Added JAR /root/apache-mahout-distribution-0.11.0/mahout-hdfs-0.11.0.jar at http://168.70.241:37966 with timestamp 1440642038652
> 15/08/27 10:20:38 INFO SparkContext: Added JAR /root/apache-mahout-distribution-0.11.0/mahout-math-0.11.0.jar at http://168.70.241:37966 with timestamp 1440642038654
> 15/08/27 10:20:38 INFO SparkContext: Added JAR /root/apache-mahout-distribution-0.11.0/mahout-math-scala_2.10-0.11.0.jar at http://168.70.241:37966 with timestamp 1440642038656
> 15/08/27 10:20:38 INFO SparkContext: Added JAR /root/apache-mahout-distribution-0.11.0/mahout-spark_2.10-0.11.0-dependency-reduced.jar at http://168.70.241:37966 with timestamp 1440642038661
> 15/08/27 10:20:38 INFO SparkContext: Added JAR /root/apache-mahout-distribution-0.11.0/mahout-spark_2.10-0.11.0.jar at http://168.70.241:37966 with timestamp 1440642038662
> 15/08/27 10:20:38 INFO SparkContext: Added JAR /root/apache-mahout-distribution-0.11.0/mahout-hdfs-0.11.0.jar at http://168.70.241:37966 with timestamp 1440642038662
> 15/08/27 10:20:38 INFO SparkContext: Added JAR /root/apache-mahout-distribution-0.11.0/mahout-math-0.11.0.jar at http://168.70.241:37966 with timestamp 1440642038665
> 15/08/27 10:20:38 INFO SparkContext: Added JAR /root/apache-mahout-distribution-0.11.0/mahout-math-scala_2.10-0.11.0.jar at http://168.70.241:37966 with timestamp 1440642038667
> 15/08/27 10:20:38 INFO SparkContext: Added JAR /root/apache-mahout-distribution-0.11.0/mahout-spark_2.10-0.11.0-dependency-reduced.jar at http://168.70.241:37966 with timestamp 1440642038674
> 15/08/27 10:20:38 INFO SparkContext: Added JAR /root/apache-mahout-distribution-0.11.0/mahout-spark_2.10-0.11.0.jar at http://168.70.241:37966 with timestamp 1440642038676
> 15/08/27 10:20:38 INFO YarnClusterScheduler: Created YarnClusterScheduler
> 15/08/27 10:20:38 ERROR YarnClusterSchedulerBackend: Application ID is not set.
> 15/08/27 10:20:38 INFO NettyBlockTransferService: Server created on 52390
> 15/08/27 10:20:38 INFO BlockManagerMaster: Trying to register BlockManager
> 15/08/27 10:20:38 INFO BlockManagerMasterActor: Registering block manager TestCent7:52390 with 1966.1 MB RAM, BlockManagerId(<driver>, TestCent7, 52390)
> 15/08/27 10:20:38 INFO BlockManagerMaster: Registered BlockManager
> Exception in thread "main" java.lang.NullPointerException
> at org.apache.spark.deploy.yarn.ApplicationMaster$.sparkContextInitialized(ApplicationMaster.scala:580)
> at org.apache.spark.scheduler.cluster.YarnClusterScheduler.postStartHook(YarnClusterScheduler.scala:32)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:541)
> at org.apache.mahout.sparkbindings.package$.mahoutSparkContext(package.scala:91)
> at org.apache.mahout.drivers.MahoutSparkDriver.start(MahoutSparkDriver.scala:83)
> at org.apache.mahout.drivers.ItemSimilarityDriver$.start(ItemSimilarityDriver.scala:118)
> at org.apache.mahout.drivers.ItemSimilarityDriver$.process(ItemSimilarityDriver.scala:199)
> at org.apache.mahout.drivers.ItemSimilarityDriver$$anonfun$main$1.apply(ItemSimilarityDriver.scala:112)
> at org.apache.mahout.drivers.ItemSimilarityDriver$$anonfun$main$1.apply(ItemSimilarityDriver.scala:110)
> at scala.Option.map(Option.scala:145)
> at org.apache.mahout.drivers.ItemSimilarityDriver$.main(ItemSimilarityDriver.scala:110)
> at org.apache.mahout.drivers.ItemSimilarityDriver.main(ItemSimilarityDriver.scala)
> {quote}
> The profile:
> {quote}
> export JAVA_HOME=/usr/java/jdk1.7.0_60/
> export HADOOP_HOME=/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p0.4/lib/hadoop
> export HADOOP_CONF_DIR=/etc/hadoop/conf
> export HADOOP_YARN_HOME=/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p0.4/lib/hadoop-yarn
> export HADOOP_YARN_CONF_DIR=/etc/hadoop/conf.cloudera.yarn
> export SPARK_HOME=/opt/cloudera/parcels/CDH-5.4.4-1.cdh5.4.4.p0.4/lib/spark
> export MAHOUT_HOME=/root/apache-mahout-distribution-0.11.0
> export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
> export PATH=$HADOOP_HOME/bin:$JAVA_HOME/bin:$SPARK_HOME/bin:$PATH
> {quote}
> I have no idea what to do about this issue. Please help. Thanks!
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)