Posted to mapreduce-user@hadoop.apache.org by Divya Gehlot <di...@gmail.com> on 2016/03/09 10:11:49 UTC

[Error]Run Spark job as hdfs user from oozie workflow

Hi,
I have a non-secure Hadoop 2.7.2 cluster on EC2 with Spark 1.5.2.
I am submitting my Spark Scala script through a shell script using an
Oozie workflow. I submit the job as the hdfs user, but it runs as user
"yarn", so all the output gets stored under the /user/yarn directory.

While googling I found YARN-2424
<https://issues.apache.org/jira/browse/YARN-2424> for non-secure clusters.
I changed the settings as per this doc
<https://www.ibm.com/support/knowledgecenter/SSPT3X_4.1.0/com.ibm.swg.im.infosphere.biginsights.install.doc/doc/inst_adv_yarn_config.html>
and when I ran my Oozie workflow as the hdfs user I got the error below:

Application application_1457494230162_0004 failed 2 times due to AM
Container for appattempt_1457494230162_0004_000002 exited with exitCode:
-1000
For more detailed output, check application tracking page:
http://ip-xxx-xx-xx-xxx.ap-southeast-1.compute.internal:8088/cluster/app/application_1457494230162_0004
Then, click on links to logs of each attempt.
Diagnostics: Application application_1457494230162_0004 initialization
failed (exitCode=255) with output: main : command provided 0
main : run as user is hdfs
main : requested yarn user is hdfs
Can't create directory
/hadoop/yarn/local/usercache/hdfs/appcache/application_1457494230162_0004 -
Permission denied
Did not create any app directories
Failing this attempt. Failing the application.
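
For reference, the settings I changed in yarn-site.xml are roughly these
(property names as in the Hadoop docs; the group value is what my cluster
uses and may differ on yours):

```xml
<!-- Switch to the LinuxContainerExecutor so containers can run as the
     submitting user instead of the NodeManager user -->
<property>
  <name>yarn.nodemanager.container-executor.class</name>
  <value>org.apache.hadoop.yarn.server.nodemanager.LinuxContainerExecutor</value>
</property>
<!-- Group that owns the container-executor binary (assumed "hadoop" here) -->
<property>
  <name>yarn.nodemanager.linux-container-executor.group</name>
  <value>hadoop</value>
</property>
<!-- In non-secure mode, run containers as the user who submitted the job
     rather than mapping everyone to a single local user -->
<property>
  <name>yarn.nodemanager.linux-container-executor.nonsecure-mode.limit-users</name>
  <value>false</value>
</property>
```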

After changing the settings, when I start the spark-shell I get an error
saying "Error starting SQLContext - Yarn application has ended".
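
While searching I came across one suggestion that may be relevant: the
"Permission denied" above can happen because the existing usercache
directories were created while containers still ran as yarn, so after the
switch the hdfs user cannot write there. A sketch of the remedy (the path
is taken from the error above; the exact commands are an assumption and
would be run on every NodeManager host with the NodeManager stopped):

```shell
# Assumption: the NodeManager local dir is /hadoop/yarn/local, as in the
# error message. With the NodeManager stopped on each host, remove the
# stale per-user caches; YARN recreates them with the correct owner:
#
#   rm -rf /hadoop/yarn/local/usercache/*
#
# After a restart, YARN rebuilds a per-user layout like the one sketched
# below in a scratch directory (a stand-in for /hadoop/yarn/local):
tmp=$(mktemp -d)
mkdir -p "$tmp/usercache/hdfs/appcache" "$tmp/usercache/hdfs/filecache"
find "$tmp/usercache" -mindepth 1 -type d | sed "s|$tmp/||" | sort
rm -rf "$tmp"
```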

Has anybody run into this kind of issue?
I would really appreciate it if you could point me to the steps/docs to
resolve it.


Thanks,
Divya