Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/11/12 11:11:35 UTC

[jira] [Commented] (SPARK-4360) task only execute on one node when spark on yarn

    [ https://issues.apache.org/jira/browse/SPARK-4360?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14207894#comment-14207894 ] 

Sean Owen commented on SPARK-4360:
----------------------------------

I don't think there's enough info here; this probably should have been asked as a question on the mailing list first.

Is there more than one partition in the input? Was more than one executor actually allocated? Are you definitely observing tasks running on executors, and not some single-threaded process on the driver?
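
For example, a quick check from spark-shell (a minimal sketch; /data is the input path from the report, and the split count of 48 is an arbitrary illustration):

    // Check how many partitions the input really has. A small input (a
    // single HDFS block) yields only one or two partitions, so only that
    // many tasks exist no matter how many executors were requested.
    val lines = sc.textFile("/data")
    println("input partitions: " + lines.partitions.length)

    // Check which executors actually registered (the driver shows up here too).
    println("executors: " + sc.getExecutorMemoryStatus.keys.mkString(", "))

    // Forcing more splits lets tasks spread across executors:
    val spread = sc.textFile("/data", 48)
    println("input partitions now: " + spread.partitions.length)

If the input has only one partition, that alone would explain a single busy node.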

> task only execute on one node when spark on yarn
> ------------------------------------------------
>
>                 Key: SPARK-4360
>                 URL: https://issues.apache.org/jira/browse/SPARK-4360
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.0.2
>            Reporter: seekerak
>
> Hadoop version: 2.0.3-alpha
> Spark version: 1.0.2
> When I run Spark jobs on YARN, all the tasks run on only one node. My cluster has 4 nodes and 3 executors were allocated, but only one executor gets tasks; the others get none. My command is like this:
> /opt/hadoopcluster/spark-1.0.2-bin-hadoop2/bin/spark-submit --class org.sr.scala.Spark_LineCount_G0 --executor-memory 2G --num-executors 12 --master yarn-cluster /home/Spark_G0.jar /data /output/ou_1
> Does anyone know why?
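
For context, a minimal sketch of what a line-count job like Spark_LineCount_G0 presumably looks like (the actual source was not posted, so this is an assumed reconstruction); the point is that the number of tasks is driven by the input's splits, not by --num-executors:

    // Hypothetical reconstruction of a line-count job; not the reporter's code.
    // Parallelism comes from the input's HDFS splits: if /data holds one small
    // file, the job has effectively one task and runs on one node.
    import org.apache.spark.{SparkConf, SparkContext}

    object LineCount {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("LineCount"))
        val lines = sc.textFile(args(0))   // e.g. /data; one partition per split
        val count = lines.count()          // tasks run only where partitions exist
        sc.parallelize(Seq(count.toString), 1).saveAsTextFile(args(1)) // e.g. /output/ou_1
        sc.stop()
      }
    }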


