Posted to user@spark.apache.org by "Kelly, Jonathan" <jo...@amazon.com> on 2015/07/06 21:24:57 UTC

Run sparkR non-interactively

Is there any way to run sparkR non-interactively? I'd like to run it in an integration test, but it does not seem to accept any parameter specifying a script to run, similar to -f with spark-sql. If I try to pipe in something via stdin, I get the following:

$ echo "demo('lm.glm')" | sparkR
Fatal error: you must specify '--save', '--no-save' or '--vanilla'

Trying to pass --no-save to sparkR doesn't work either; I assume these options are not actually being passed through to the R shell.

$ echo "demo('lm.glm')" | sparkR --no-save
Fatal error: you must specify '--save', '--no-save' or '--vanilla'

Thanks,
Jonathan

Re: Run sparkR non-interactively

Posted by "Kelly, Jonathan" <jo...@amazon.com>.
Yeah, I noticed right after I sent this email that I could just spark-submit an R file, which works for me. Thanks!

~ Jonathan

From: Shivaram Venkataraman <sh...@eecs.berkeley.edu>
Reply-To: Shivaram Venkataraman <sh...@eecs.berkeley.edu>
Date: Monday, July 6, 2015 at 4:39 PM
To: Jonathan Kelly <jo...@amazon.com>
Subject: Re: Run sparkR non-interactively

You can run it by passing an R file as the argument to either bin/spark-submit or bin/sparkR. For example, `./bin/spark-submit examples/src/main/r/dataframe.R` should work when run locally.
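For reference, a submitted script along these lines might look like the sketch below. This is an illustration only, assuming the Spark 1.4-era SparkR API (sparkR.init, sparkRSQL.init, createDataFrame); the file name and app name are made up for the example:

```r
# example.R -- sketch of a SparkR script suitable for spark-submit
# (assumes the Spark 1.4-era SparkR API)
library(SparkR)

# Initialize the Spark context and SQL context
sc <- sparkR.init(appName = "NonInteractiveExample")
sqlContext <- sparkRSQL.init(sc)

# Build a Spark DataFrame from a built-in local R data.frame and
# print the first few rows
df <- createDataFrame(sqlContext, faithful)
head(df)

# Shut down cleanly so the job exits on its own
sparkR.stop()
```

Submitted as `./bin/spark-submit example.R`, the script runs to completion and exits, so none of R's interactive-session flags (--save / --no-save) come into play.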

Thanks
Shivaram


On Mon, Jul 6, 2015 at 12:24 PM, Kelly, Jonathan <jo...@amazon.com> wrote: