Posted to user@spark.apache.org by Kyle Ellrott <ke...@soe.ucsc.edu> on 2014/02/18 01:53:44 UTC

Defining SparkShell Init?

Is there a way to define a set of commands to 'initialize' the environment
in the SparkShell?
I'd like to create a wrapper script that starts up the spark-shell and does
some boilerplate initialization (imports and variable creation) before
handing things over to me.

Kyle

Re: Defining SparkShell Init?

Posted by Andrew Ash <an...@andrewash.com>.
Why would Scala 2.11 change things here? I'm not familiar with which
features you're referring to.

I would support a prelude file in ~/.sparkrc or similar that is
automatically imported on spark-shell startup if it exists.

Sent from my mobile phone
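A minimal sketch of what such a wrapper could look like, assuming the
~/.sparkrc convention proposed above (this is not an existing Spark
feature, and the SPARK_SHELL override is only there so the plumbing can be
tried without a Spark installation):

```shell
# Hypothetical wrapper function (could live in a wrapper script):
# if ~/.sparkrc exists, stream it into the shell's stdin ahead of the
# user's own input, so the prelude is evaluated first and the session
# then stays interactive.
spark_shell_with_prelude() {
    # SPARK_SHELL is overridable (e.g. SPARK_SHELL=cat) to test the
    # plumbing without Spark itself.
    shell_cmd="${SPARK_SHELL:-spark-shell}"
    prelude="$HOME/.sparkrc"
    if [ -f "$prelude" ]; then
        # `cat prelude -` emits the prelude, then passes stdin through.
        cat "$prelude" - | "$shell_cmd" "$@"
    else
        "$shell_cmd" "$@"
    fi
}
```

With a ~/.sparkrc containing, say, an import line, that import would
already be evaluated by the time the prompt appears.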

Re: Defining SparkShell Init?

Posted by Prashant Sharma <sc...@gmail.com>.
There is a :load command in the shell, where you can specify the path of
your boilerplate.scala. These things should be streamlined once we have
Scala 2.11 (I hope).
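For example (the file name and its contents are illustrative, not from
this thread):

```
// boilerplate.scala -- an illustrative prelude
import org.apache.spark.SparkContext._
val logs = sc.textFile("hdfs:///data/logs")
```

Then, from a running spark-shell:

```
scala> :load boilerplate.scala
```

:load is a standard Scala REPL command that spark-shell inherits; it reads
the file and evaluates it line by line, as if the contents had been typed
at the prompt.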





-- 
Prashant

Re: Defining SparkShell Init?

Posted by Mayur Rustagi <ma...@gmail.com>.
That's actually not a bad idea: have a shellboilerplate.scala in the
same folder that is used to initialize the shell.
The shell is a script that, at the end of the day, starts a JVM with jars
from the Spark project; mostly you'll have to modify the Spark classes and
reassemble using sbt. It's messy, but there may be easier ways to feed some
data to the shell script/JVM and then connect with stdin.
Regards
Mayur
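That stdin route can be sketched in a couple of lines. The file name below
is illustrative, and `cat` stands in for spark-shell so the plumbing is
visible without a Spark installation:

```shell
# Write an illustrative prelude file.
printf 'println("prelude ran")\n' > /tmp/prelude.scala

# `cat /tmp/prelude.scala -` emits the prelude first, then keeps stdin
# attached so the user's own input follows and the session stays
# interactive. With Spark installed this would be:
#   cat /tmp/prelude.scala - | spark-shell
# Here a trailing `cat` stands in for spark-shell to show the ordering:
printf 'println("user input")\n' | cat /tmp/prelude.scala - | cat
# prints the prelude line first, then the user line
```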



-- 
Sent from Gmail Mobile