Posted to user@spark.apache.org by goi cto <go...@gmail.com> on 2014/03/16 22:38:45 UTC

Running Spark on a single machine

Hi,

I know it is probably not the purpose of Spark, but the syntax is easy and
cool...
I need to run some Spark-like code in memory on a single machine. Any
pointers on how to optimize it to run on only one machine?


-- 
Eran | CTO

Re: Running Spark on a single machine

Posted by Ewen Cheslack-Postava <me...@ewencp.org>.
Those pages include instructions for running locally:

"Note that all of the sample programs take a |<master>| parameter 
specifying the cluster URL to connect to. This can be a URL for a 
distributed cluster 
<http://spark.apache.org/docs/latest/scala-programming-guide.html#master-urls>, 
or |local| to run locally with one thread, or |local[N]| to run locally 
with N threads. You should start by using |local| for testing."

Pass "local[N]" as the master and it'll run locally with N threads. 
Alternatively, set up a standalone "cluster" as you normally would, just 
run the master and 1 slave locally.
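
A minimal Scala sketch of that first option (the application name, thread
count, and word-count logic below are only illustrative, not from the thread):

  // Run a Spark job entirely in one JVM by using a local master URL.
  import org.apache.spark.{SparkConf, SparkContext}
  // On older Spark releases (pre-1.3) this import provides reduceByKey and
  // the other pair-RDD operations; it is harmless on newer ones.
  import org.apache.spark.SparkContext._

  object LocalWordCount {
    def main(args: Array[String]): Unit = {
      // "local[2]" = run in-process with 2 worker threads; "local" = 1 thread.
      val conf = new SparkConf().setAppName("LocalWordCount").setMaster("local[2]")
      val sc = new SparkContext(conf)

      val counts = sc.parallelize(Seq("spark", "local", "spark"))
        .map(word => (word, 1))
        .reduceByKey(_ + _)

      counts.collect().foreach(println)  // e.g. (spark,2), (local,1)
      sc.stop()
    }
  }

With a local master the driver and all tasks stay in a single JVM, so
nothing beyond the Spark libraries is needed.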

-Ewen

> goi cto <ma...@gmail.com>
> March 16, 2014 at 11:42 PM
> Sorry, I did not explain myself correctly.
> I know how to run Spark; the question is how to instruct Spark to do
> all of the computation on a single machine.
> I was trying to convert the code to Scala, but I am missing some of
> Spark's methods, such as reduceByKey.
>
> Eran
>
> -- 
> Eran | CTO
> Nick Pentreath <ma...@gmail.com>
> March 16, 2014 at 10:25 PM
> Please follow the instructions at 
> http://spark.apache.org/docs/latest/index.html and 
> http://spark.apache.org/docs/latest/quick-start.html to get started on 
> a local machine.
>
>
> —
> Sent from Mailbox <https://www.dropbox.com/mailbox> for iPhone
>
> goi cto <ma...@gmail.com>
> March 16, 2014 at 2:38 PM
> Hi,
>
> I know it is probably not the purpose of Spark, but the syntax is easy
> and cool...
> I need to run some Spark-like code in memory on a single machine. Any
> pointers on how to optimize it to run on only one machine?
>
>
> -- 
> Eran | CTO

Re: Running Spark on a single machine

Posted by goi cto <go...@gmail.com>.
Sorry, I did not explain myself correctly.
I know how to run Spark; the question is how to instruct Spark to do all of
the computation on a single machine.
I was trying to convert the code to Scala, but I am missing some of Spark's
methods, such as reduceByKey.

Eran
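
For context, reduceByKey is not a standard Scala collections method; it is
one of Spark's pair-RDD operations, so it disappears when code is ported to
plain Scala. A rough sketch of both routes (the sample data and names below
are only illustrative): keep Spark but give it a local master, or emulate
the operation with groupBy on ordinary collections.

  import org.apache.spark.SparkContext
  // Needed on older Spark releases (pre-1.3) for reduceByKey on pair RDDs.
  import org.apache.spark.SparkContext._

  object ReduceByKeyLocally {
    def main(args: Array[String]): Unit = {
      val pairs = Seq(("a", 1), ("b", 2), ("a", 3))

      // Option 1: keep Spark, but run everything inside this JVM.
      val sc = new SparkContext("local", "ReduceByKeyLocally")
      val viaSpark = sc.parallelize(pairs).reduceByKey(_ + _).collect().toMap
      sc.stop()

      // Option 2: plain Scala equivalent, no Spark at all.
      val viaScala = pairs.groupBy(_._1).map { case (k, vs) => (k, vs.map(_._2).sum) }

      println(viaSpark)  // both give a -> 4, b -> 2
      println(viaScala)
    }
  }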


On Mon, Mar 17, 2014 at 7:25 AM, Nick Pentreath <ni...@gmail.com> wrote:

> Please follow the instructions at
> http://spark.apache.org/docs/latest/index.html and
> http://spark.apache.org/docs/latest/quick-start.html to get started on a
> local machine.
>
>
> —
> Sent from Mailbox <https://www.dropbox.com/mailbox> for iPhone
>
>
> On Sun, Mar 16, 2014 at 11:39 PM, goi cto <go...@gmail.com> wrote:
>
>> Hi,
>>
>> I know it is probably not the purpose of Spark, but the syntax is easy and
>> cool...
>> I need to run some Spark-like code in memory on a single machine. Any
>> pointers on how to optimize it to run on only one machine?
>>
>>
>> --
>> Eran | CTO
>>
>
>


-- 
Eran | CTO

Re: Running Spark on a single machine

Posted by Nick Pentreath <ni...@gmail.com>.
Please follow the instructions at http://spark.apache.org/docs/latest/index.html and http://spark.apache.org/docs/latest/quick-start.html to get started on a local machine.




—
Sent from Mailbox for iPhone

On Sun, Mar 16, 2014 at 11:39 PM, goi cto <go...@gmail.com> wrote:

> Hi,
> I know it is probably not the purpose of Spark, but the syntax is easy and
> cool...
> I need to run some Spark-like code in memory on a single machine. Any
> pointers on how to optimize it to run on only one machine?
> -- 
> Eran | CTO