Posted to user@spark.apache.org by Sai Prasanna <an...@gmail.com> on 2014/05/19 09:41:26 UTC

persist @ disk-only failing

Hi all,

When I set the persist level to DISK_ONLY, Spark still tries to use memory
and cache the data.
Any reason?
Do I need to override some parameter elsewhere?

Thanks!
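
For reference, here is a minimal sketch of the kind of call under discussion; the input path and application name below are illustrative, not taken from the original message:

import org.apache.spark.SparkContext
import org.apache.spark.storage.StorageLevel

object DiskOnlyExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "disk-only-example")

    // Ask Spark to keep the persisted blocks on disk only, not in memory.
    val rdd = sc.textFile("hdfs:///some/input")      // illustrative input path
      .map(_.toUpperCase)
      .persist(StorageLevel.DISK_ONLY)

    rdd.count()                                      // forces computation and persistence
    println(rdd.getStorageLevel)                     // should report a disk-only level
    sc.stop()
  }
}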

Re: persist @ disk-only failing

Posted by Sai Prasanna <an...@gmail.com>.
Ok Thanks!


On Mon, May 19, 2014 at 10:09 PM, Matei Zaharia <ma...@gmail.com> wrote:

> This is the patch for it: https://github.com/apache/spark/pull/50/. It
> might be possible to backport it to 0.8.
>
> Matei
>
> On May 19, 2014, at 2:04 AM, Sai Prasanna <an...@gmail.com> wrote:
>
> Matei, I am using 0.8.1!
>
> But is there a way to bypass the cache without moving to 0.9.1?
>
>
> On Mon, May 19, 2014 at 1:31 PM, Matei Zaharia <ma...@gmail.com> wrote:
>
>> What version is this with? We used to build each partition first before
>> writing it out, but this was fixed a while back (0.9.1, but it may also be
>> in 0.9.0).
>>
>> Matei
>>
>> On May 19, 2014, at 12:41 AM, Sai Prasanna <an...@gmail.com>
>> wrote:
>>
>> > Hi all,
>> >
>> > When I set the persist level to DISK_ONLY, Spark still tries to use
>> > memory and cache the data.
>> > Any reason?
>> > Do I need to override some parameter elsewhere?
>> >
>> > Thanks!
>>
>>
>
>

Re: persist @ disk-only failing

Posted by Matei Zaharia <ma...@gmail.com>.
This is the patch for it: https://github.com/apache/spark/pull/50/. It might be possible to backport it to 0.8.

Matei

On May 19, 2014, at 2:04 AM, Sai Prasanna <an...@gmail.com> wrote:

> Matei, I am using 0.8.1!
>
> But is there a way to bypass the cache without moving to 0.9.1?
> 
> 
> On Mon, May 19, 2014 at 1:31 PM, Matei Zaharia <ma...@gmail.com> wrote:
> What version is this with? We used to build each partition first before writing it out, but this was fixed a while back (0.9.1, but it may also be in 0.9.0).
> 
> Matei
> 
> On May 19, 2014, at 12:41 AM, Sai Prasanna <an...@gmail.com> wrote:
> 
> > Hi all,
> >
> > When I set the persist level to DISK_ONLY, Spark still tries to use memory and cache the data.
> > Any reason?
> > Do I need to override some parameter elsewhere?
> >
> > Thanks!
> 
> 


Re: persist @ disk-only failing

Posted by Sai Prasanna <an...@gmail.com>.
Matei, I am using 0.8.1!

But is there a way to bypass the cache without moving to 0.9.1?


On Mon, May 19, 2014 at 1:31 PM, Matei Zaharia <ma...@gmail.com> wrote:

> What version is this with? We used to build each partition first before
> writing it out, but this was fixed a while back (0.9.1, but it may also be
> in 0.9.0).
>
> Matei
>
> On May 19, 2014, at 12:41 AM, Sai Prasanna <an...@gmail.com>
> wrote:
>
> > Hi all,
> >
> > When I set the persist level to DISK_ONLY, Spark still tries to use
> > memory and cache the data.
> > Any reason?
> > Do I need to override some parameter elsewhere?
> >
> > Thanks!
>
>
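
On the question above of bypassing the cache without upgrading: one possible workaround is to write the intermediate RDD to storage explicitly and read it back, rather than relying on persist. A minimal sketch, assuming an HDFS scratch path is available (both paths below are illustrative):

import org.apache.spark.SparkContext

object BypassCacheExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "bypass-cache-example")

    val intermediate = sc.textFile("hdfs:///some/input")    // illustrative input path
      .map(_.split(",")(0))

    // Write the intermediate result to durable storage instead of caching it.
    val savedPath = "hdfs:///tmp/intermediate"               // illustrative scratch path
    intermediate.saveAsObjectFile(savedPath)

    // Later stages read it back from disk, so nothing needs to sit in the cache.
    val reloaded = sc.objectFile[String](savedPath)
    println(reloaded.count())

    sc.stop()
  }
}

Whether this is acceptable depends on the job, since it adds an extra full write and read of the data and uses Java serialization.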

Re: persist @ disk-only failing

Posted by Matei Zaharia <ma...@gmail.com>.
What version is this with? We used to build each partition first before writing it out, but this was fixed a while back (0.9.1, but it may also be in 0.9.0).

Matei

On May 19, 2014, at 12:41 AM, Sai Prasanna <an...@gmail.com> wrote:

> Hi all,
> 
> When I set the persist level to DISK_ONLY, Spark still tries to use memory and cache the data.
> Any reason?
> Do I need to override some parameter elsewhere?
>
> Thanks!
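
To illustrate the behaviour described above: in the older code path a partition was materialized in memory before being handed to the disk store, while the fix writes elements out as the iterator is consumed. This is only a simplified sketch of the idea, not Spark's actual internal code:

import java.io.{BufferedOutputStream, FileOutputStream, ObjectOutputStream}

object PartitionWriteSketch {
  // Older behaviour (simplified): unroll the whole partition into an in-memory
  // array first, which defeats DISK_ONLY for partitions that do not fit in the heap.
  def writeMaterialized(partition: Iterator[String], file: String): Unit = {
    val all = partition.toArray
    val out = new ObjectOutputStream(new BufferedOutputStream(new FileOutputStream(file)))
    try all.foreach(x => out.writeObject(x)) finally out.close()
  }

  // Fixed behaviour (simplified): stream elements straight to disk as the
  // iterator is consumed, so memory use stays bounded regardless of partition size.
  def writeStreaming(partition: Iterator[String], file: String): Unit = {
    val out = new ObjectOutputStream(new BufferedOutputStream(new FileOutputStream(file)))
    try partition.foreach(x => out.writeObject(x)) finally out.close()
  }
}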