Posted to user@spark.apache.org by Jianshi Huang <ji...@gmail.com> on 2014/09/03 10:47:24 UTC

.sparkrc for Spark shell?

To make my shell experience merrier, I need to import several packages, and
define implicit sparkContext and sqlContext.

Is there a startup file (e.g. ~/.sparkrc) that Spark shell will load when
it's started?


Cheers,
-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/

Re: .sparkrc for Spark shell?

Posted by "Dimension Data, LLC." <su...@didata.us>.
Hello:

Question...

Is the below -- more or less -- equivalent to doing this for pyspark:

    user$ export PYTHONSTARTUP=/path/to/my/pythonStartup.py
    user$ pyspark

Actually, this is how I start pyspark. By reverse engineering how pyspark
starts, I wrote a broader pythonStartup.py script so that, among other
things, it adds the environment & imports that I need (numpy, matplotlib,
scipy, etc.), and I can also use it like this (a sketch of such a startup
file follows the list):

  >> python  -i /path/to/my/pythonStartup.py
  >> bpython -i /path/to/my/pythonStartup.py (excellent for its code intelligence / completion).
  >> And used for the python shell that starts in my WING IDE.
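
For reference, a minimal sketch of what a pythonStartup.py along these
lines might contain (illustrative only -- the app name and context setup
are assumptions, and it presumes $SPARK_HOME/python plus its py4j zip are
already on PYTHONPATH; the pyspark launcher normally creates sc itself):

    # pythonStartup.py -- illustrative sketch only; adapt to your setup.
    # Assumes $SPARK_HOME/python (and its py4j zip) is on PYTHONPATH.
    import numpy as np
    import matplotlib.pyplot as plt
    import scipy

    from pyspark import SparkConf, SparkContext
    from pyspark.sql import SQLContext

    # Build a local SparkContext so a plain 'python -i' session gets one
    # too; when launched via the pyspark script, sc is created for you.
    conf = SparkConf().setAppName("interactive-shell").setMaster("local[*]")
    sc = SparkContext(conf=conf)
    sqlContext = SQLContext(sc)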

So just curious about '-i'. :)

Thank you,
didata


On 09/03/2014 07:05 AM, Prashant Sharma wrote:
> Hey,
>
> You can use spark-shell -i sparkrc to do this.
>
> Prashant Sharma
>
>
>
>
> On Wed, Sep 3, 2014 at 2:17 PM, Jianshi Huang <jianshi.huang@gmail.com>
> wrote:
>
>     To make my shell experience merrier, I need to import several
>     packages, and define implicit sparkContext and sqlContext.
>
>     Is there a startup file (e.g. ~/.sparkrc) that Spark shell will
>     load when it's started?
>
>
>     Cheers,
>     -- 
>     Jianshi Huang
>
>     LinkedIn: jianshi
>     Twitter: @jshuang
>     Github & Blog: http://huangjs.github.com/
>
>

-- 
Dimension Data, LLC.
Sincerely yours,
Team Dimension Data
------------------------------------------------------------------------
Dimension Data, LLC. | https://www.didata.us
P: 212.882.1276 | subscriptions@didata.us
Follow Us: https://www.LinkedIn.com/company/didata

Data Analytics you can literally count on.


Re: .sparkrc for Spark shell?

Posted by Jianshi Huang <ji...@gmail.com>.
I see. Thanks, Prashant!

Jianshi


On Wed, Sep 3, 2014 at 7:05 PM, Prashant Sharma <sc...@gmail.com>
wrote:

> Hey,
>
> You can use spark-shell -i sparkrc to do this.
>
> Prashant Sharma
>
>
>
>
> On Wed, Sep 3, 2014 at 2:17 PM, Jianshi Huang <ji...@gmail.com>
> wrote:
>
>> To make my shell experience merrier, I need to import several packages,
>> and define implicit sparkContext and sqlContext.
>>
>> Is there a startup file (e.g. ~/.sparkrc) that Spark shell will load when
>> it's started?
>>
>>
>> Cheers,
>> --
>> Jianshi Huang
>>
>> LinkedIn: jianshi
>> Twitter: @jshuang
>> Github & Blog: http://huangjs.github.com/
>>
>
>


-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/

Re: .sparkrc for Spark shell?

Posted by Prashant Sharma <sc...@gmail.com>.
Hey,

You can use spark-shell -i sparkrc to do this.
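
For example, a minimal sparkrc along those lines might look like this (a
sketch only -- the 1.x shell already defines sc, so the file just layers
imports and a sqlContext on top):

    // sparkrc -- load with: spark-shell -i sparkrc
    // Sketch only; adjust the imports to taste.
    import org.apache.spark.sql.SQLContext

    // Define sqlContext on top of the shell's SparkContext and bring its
    // implicit conversions into scope at the prompt.
    val sqlContext = new SQLContext(sc)
    import sqlContext._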

Prashant Sharma




On Wed, Sep 3, 2014 at 2:17 PM, Jianshi Huang <ji...@gmail.com>
wrote:

> To make my shell experience merrier, I need to import several packages,
> and define implicit sparkContext and sqlContext.
>
> Is there a startup file (e.g. ~/.sparkrc) that Spark shell will load when
> it's started?
>
>
> Cheers,
> --
> Jianshi Huang
>
> LinkedIn: jianshi
> Twitter: @jshuang
> Github & Blog: http://huangjs.github.com/
>