Posted to dev@spark.apache.org by Isca Harmatz <po...@gmail.com> on 2015/06/16 06:45:35 UTC

Random Forest driver memory

hello,

I have noticed that the random forest implementation crashes when
too many trees / too large a maxDepth is used.

I'm guessing that this has something to do with the number of nodes that
need to be kept in the driver's memory during the run.

but when I examined the node structure, it seemed rather small.

does anyone know where the memory issue comes from?
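
For reference, a minimal sketch of the kind of call I mean (the RDD-based
MLlib RandomForest API, following the programming guide example; the dataset
path and the parameter values here are just placeholders, not my actual setup):

import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.util.MLUtils

// load a LIBSVM-format dataset (placeholder path, for illustration only);
// sc is an existing SparkContext
val data = MLUtils.loadLibSVMFile(sc, "data/sample_libsvm_data.txt")

val numClasses = 2
val categoricalFeaturesInfo = Map[Int, Int]()
val numTrees = 100              // number of trees in the forest (example value)
val featureSubsetStrategy = "auto"
val impurity = "gini"
val maxDepth = 20               // maximum depth of each tree (example value)
val maxBins = 32

// pushing numTrees / maxDepth well beyond values like these is what
// triggers the crash for me
val model = RandomForest.trainClassifier(data, numClasses,
  categoricalFeaturesInfo, numTrees, featureSubsetStrategy, impurity,
  maxDepth, maxBins)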

thanks,
  Isca

Re: Random Forest driver memory

Posted by Joseph Bradley <jo...@databricks.com>.
Hi Isca,

Could you please give more details? Data size, model parameters, stack
traces / logs, etc., would help get a better picture.

Thanks,
Joseph

On Wed, Jun 17, 2015 at 9:56 AM, Isca Harmatz <po...@gmail.com> wrote:

> hello,
>
> does anyone have any help on this issue?
>
>
> Isca
>
> On Tue, Jun 16, 2015 at 7:45 AM, Isca Harmatz <po...@gmail.com> wrote:
>
>> hello,
>>
>> I have noticed that the random forest implementation crashes when
>> too many trees / too large a maxDepth is used.
>>
>> I'm guessing that this has something to do with the number of nodes that
>> need to be kept in the driver's memory during the run.
>>
>> but when I examined the node structure, it seemed rather small.
>>
>> does anyone know where the memory issue comes from?
>>
>> thanks,
>>   Isca
>>
>
>

Re: Random Forest driver memory

Posted by Isca Harmatz <po...@gmail.com>.
hello,

does anyone have any help on this issue?


Isca

On Tue, Jun 16, 2015 at 7:45 AM, Isca Harmatz <po...@gmail.com> wrote:

> hello,
>
> I have noticed that the random forest implementation crashes when
> too many trees / too large a maxDepth is used.
>
> I'm guessing that this has something to do with the number of nodes that
> need to be kept in the driver's memory during the run.
>
> but when I examined the node structure, it seemed rather small.
>
> does anyone know where the memory issue comes from?
>
> thanks,
>   Isca
>