Posted to dev@spark.apache.org by "Jagadeesan A.S." <li...@gmail.com> on 2015/12/07 14:55:40 UTC

java.lang.OutOfMemoryError: Java heap space

Hi dev,

We are testing Spark performance with spark-perf. While generating output for
python_mllib-perf, we are hitting the following issue:

https://github.com/databricks/spark-perf/issues/92

The JVM on this machine reports:

Max. Heap Size (Estimated): 8.00G
------------------------------

We made the following changes in the spark-perf-master/config/config.py file:

SPARK_HOME_DIR = "/home/test/spark-1.5.1"
RUN_SPARK_TESTS = False
RUN_PYSPARK_TESTS = False
RUN_STREAMING_TESTS = False
RUN_MLLIB_TESTS = False
RUN_PYTHON_MLLIB_TESTS = True

PREP_SPARK_TESTS = False
PREP_PYSPARK_TESTS = False
PREP_STREAMING_TESTS = False
PREP_MLLIB_TESTS = False

COMMON_JAVA_OPTS = [
    JavaOptionSet("spark.storage.memoryFraction", [0.66]),
    JavaOptionSet("spark.serializer", ["org.apache.spark.serializer.JavaSerializer"]),
    JavaOptionSet("spark.executor.memory", ["16g"]),
    JavaOptionSet("spark.locality.wait", [str(60 * 1000 * 1000)]),
]

SPARK_DRIVER_MEMORY = "4g"
MLLIB_SPARK_VERSION = 1.5
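
One adjustment we have not tried yet: the driver heap is capped at 4g above
while the executors get 16g, so if the OutOfMemoryError is raised driver-side
(e.g. while collecting results), raising SPARK_DRIVER_MEMORY might help. A
minimal sketch of that change in config.py; the 8g value is only an
illustrative guess:

# Assumption: the heap OOM occurs in the driver JVM, not in the executors.
# SPARK_DRIVER_MEMORY is the spark-perf setting shown above, which spark-perf
# uses as the driver JVM's heap limit.
SPARK_DRIVER_MEMORY = "8g"

If the error instead comes from an executor, spark.executor.memory (already
16g here) would be the relevant knob.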


Thanks & Regards

Jagadeesan A S