Posted to user@spark.apache.org by Greg Hill <gr...@RACKSPACE.COM> on 2014/09/23 17:06:43 UTC

recommended values for spark driver memory?

I know the recommendation is "it depends", but can people share how much memory they're allocating to their driver processes?  I'd like to get an idea of what the range looks like so we can provide sensible defaults without necessarily knowing what the jobs will look like.  The customer can then tweak it if they need to for their particular job.
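
To make the question concrete, this is the kind of setup I'm picturing (the memory values, class name, and jar name below are just placeholders, not recommendations): a cluster-wide default in conf/spark-defaults.conf that a user can override per job at submit time.

    # conf/spark-defaults.conf -- cluster-wide default (placeholder value)
    spark.driver.memory  2g

    # Per-job override at submit time; the command-line flag takes
    # precedence over the spark-defaults.conf setting
    spark-submit --driver-memory 4g --class com.example.MyJob my-job.jar

Since the spark-submit flag wins over spark-defaults.conf, the default only needs to be a reasonable starting point, not a fit for every job.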

Thanks in advance.

Greg