Posted to user@spark.apache.org by wgtmac <us...@gmail.com> on 2016/06/21 01:18:24 UTC

Build Spark 2.0 succeeded but could not run it on YARN

I ran into problems building Spark 2.0. The build itself succeeded, but when I
uploaded the build to the cluster and launched the Spark shell on YARN, it
reported the following exception over and over:

16/06/17 03:32:00 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:
Container marked as failed: container_e437_1464601161543_1582846_01_000013
on host: hadoopworker575-sjc1.XXXXXXXXXXXXXXXX. Exit status: 1. Diagnostics:
Exception from container-launch.
Container id: container_e437_1464601161543_1582846_01_000013
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
	at org.apache.hadoop.util.Shell.run(Shell.java:455)
	at
org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
	at
org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
	at
org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
	at
org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)

Container exited with a non-zero exit code 1

=============================
Build command: 

export JAVA_HOME=   # tried both Java 7 and Java 8
./dev/change-scala-version.sh 2.11   # tried both 2.10 and 2.11
./build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive \
    -Phive-thriftserver -DskipTests clean package

The 2.0.0-preview version downloaded from the Spark website works well, so the
problem is not my cluster. I can also build Spark 1.5 and 1.6 and run them on
the cluster. But with Spark 2.0, I failed with both the 2.0.0-preview tag and
2.0.0-SNAPSHOT. Does anyone have any idea? Thanks!
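
For reference, the application id needed to pull the YARN logs can be derived
from the container id in the diagnostics above. This is a small shell sketch;
it assumes YARN's standard container naming scheme
(container_e<epoch>_<clusterTimestamp>_<appSequence>_<attempt>_<container>):

```shell
# Derive the YARN application id from the failed container id above,
# then fetch its aggregated logs (assumes log aggregation is enabled).
CONTAINER_ID="container_e437_1464601161543_1582846_01_000013"
# fields (split on "_"): container / e<epoch> / <cluster-ts> / <app-seq> / <attempt> / <container>
APP_ID="application_$(echo "$CONTAINER_ID" | cut -d_ -f3,4)"
echo "$APP_ID"
# yarn logs -applicationId "$APP_ID"   # run this on a cluster gateway node
```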




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Build-Spark-2-0-succeeded-but-could-not-run-it-on-YARN-tp27199.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Build Spark 2.0 succeeded but could not run it on YARN

Posted by Wu Gang <us...@gmail.com>.
Hi Ted,

I didn't type any command; it threw that exception right after the shell launched.

Thanks!

On Mon, Jun 20, 2016 at 7:18 PM, Ted Yu <yu...@gmail.com> wrote:

> What operations did you run in the Spark shell ?
>
> It would be easier for other people to reproduce using your code snippet.
>
> Thanks
>
> [earlier quoted messages elided; see the original post above]

Re: Build Spark 2.0 succeeded but could not run it on YARN

Posted by Ted Yu <yu...@gmail.com>.
What operations did you run in the Spark shell?

It would be easier for other people to reproduce using your code snippet.

Thanks

On Mon, Jun 20, 2016 at 6:20 PM, Jeff Zhang <zj...@gmail.com> wrote:

> Could you check the yarn app logs for details ? run command "yarn logs
> -applicationId <appid>" to get the yarn log
>
> [original post quoted in full; elided]

Re: Build Spark 2.0 succeeded but could not run it on YARN

Posted by Jeff Zhang <zj...@gmail.com>.
Could you check the YARN app logs for details? Run the command "yarn logs
-applicationId <appid>" to get the YARN log.
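
Once the logs are retrieved, the real failure is usually buried in an
executor's stderr. A minimal sketch of scanning a saved log file follows; the
sample log content is a made-up placeholder, not output from the actual
cluster:

```shell
# After running:  yarn logs -applicationId <appid> > app.log
# scan the saved log for the first error lines.
# The log content below is a made-up placeholder, not real cluster output.
cat > app.log <<'EOF'
16/06/17 03:32:00 INFO executor.CoarseGrainedExecutorBackend: starting
Error: placeholder error line from the container stderr
EOF
MATCHES="$(grep -n -i 'error' app.log)"
echo "$MATCHES"
rm -f app.log
```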

On Tue, Jun 21, 2016 at 9:18 AM, wgtmac <us...@gmail.com> wrote:

> [original post quoted in full; elided]


-- 
Best Regards

Jeff Zhang