Posted to user@spark.apache.org by Ia...@tdameritrade.com on 2016/11/09 20:11:18 UTC

Issue Running sparkR on YARN

Hi,

I’m trying to run sparkR (1.5.2) on YARN and I get:

 java.io.IOException: Cannot run program "Rscript": error=2, No such file or directory

This strikes me as odd, because I can log in to each node as various users, type Rscript, and it works. I've also set this on each node and in spark-env.sh: export R_HOME=/path/to/R

This is how I’m setting it on the nodes (/etc/profile.d/path_edit.sh):

export R_HOME=/app/hdp_app/anaconda/bin/R
PATH=$PATH:/app/hdp_app/anaconda/bin

Any ideas?

Thanks,

Ian

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: Issue Running sparkR on YARN

Posted by Felix Cheung <fe...@hotmail.com>.
It may be that the Spark executor is running as a different user and it can't see where Rscript is?

You might want to try adding the Rscript path to PATH.
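
For example, on YARN you can pass the PATH through Spark's own config instead of relying on /etc/profile.d (a sketch only; the anaconda path is the one from your message, and whether the containers honor PATH set this way depends on your cluster setup):

  # spark-defaults.conf, or pass each line with --conf on spark-submit
  spark.executorEnv.PATH        /app/hdp_app/anaconda/bin:/usr/bin:/bin
  spark.yarn.appMasterEnv.PATH  /app/hdp_app/anaconda/bin:/usr/bin:/bin

YARN containers usually aren't launched as login shells, so scripts under /etc/profile.d may never be sourced for the executor process. That would explain why Rscript works when you type it interactively but not from Spark.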

Also, please see this page for the config property that sets which R command to use:
https://spark.apache.org/docs/latest/configuration.html#sparkr
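
A sketch of what that could look like on the submit side (the property name here is an assumption for the 1.x line and should be checked against the page above; newer releases document it as spark.r.command, and yourscript.R is just a placeholder):

  spark-submit \
    --master yarn \
    --conf spark.sparkr.r.command=/app/hdp_app/anaconda/bin/Rscript \
    yourscript.R

Pointing the config at the absolute Rscript path sidesteps PATH entirely, so it is often the simpler fix when only SparkR needs the anaconda R.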


