Posted to user@spark.apache.org by richiesgr <ri...@gmail.com> on 2014/09/14 13:41:02 UTC

Driver fails with out-of-memory exception

Hi

I've written a job (not very complicated, I think; only one reduceByKey), but
the driver JVM always hangs with an OutOfMemoryError, which of course kills
the worker. How can I tell what runs on the driver and what runs on the
workers, and how can I debug the memory problem?
I've already passed --driver-memory 4g to give it more memory, but nothing
helps; it always fails.
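
For reference, I submit it roughly like this (the class, jar, and master
below are placeholders, not my real values):

  spark-submit \
    --class com.example.MyJob \
    --master spark://master:7077 \
    --driver-memory 4g \
    my-job.jar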

Thanks



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Driver-fail-with-out-of-memory-exception-tp14188.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Driver fails with out-of-memory exception

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Try increasing the number of partitions when you do the reduceByKey(); it
accepts an explicit partition count as a second argument:
<http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.api.java.JavaPairRDD>
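
For example, a minimal sketch (the input path, key extraction, and partition
count below are placeholders to illustrate the API, not taken from your job):

  import org.apache.spark.{SparkConf, SparkContext}

  object ReduceByKeyExample {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("ReduceByKeyExample"))
      // Placeholder input; substitute your own source.
      val pairs = sc.textFile("hdfs:///path/to/input")
        .map(line => (line.split("\t")(0), 1L))
      // Passing an explicit partition count to reduceByKey() makes each
      // shuffle partition smaller, so a single reduce task needs less memory.
      val counts = pairs.reduceByKey(_ + _, numPartitions = 400)
      counts.saveAsTextFile("hdfs:///path/to/output")
      sc.stop()
    }
  }

With more partitions, each task processes less data at a time. Also note what
runs where: code inside transformations (map, reduceByKey, ...) runs on the
executors, while collect()/take() results and job bookkeeping live in the
driver JVM, so collecting a large result is a common cause of driver OOMs.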

Thanks
Best Regards

On Sun, Sep 14, 2014 at 5:11 PM, richiesgr <ri...@gmail.com> wrote:

> Hi
>
> I've written a job (not very complicated, I think; only one reduceByKey),
> but the driver JVM always hangs with an OutOfMemoryError, which of course
> kills the worker. How can I tell what runs on the driver and what runs on
> the workers, and how can I debug the memory problem?
> I've already passed --driver-memory 4g to give it more memory, but nothing
> helps; it always fails.
>
> Thanks