Posted to user@spark.apache.org by "Kali.tummala@gmail.com" <Ka...@gmail.com> on 2016/01/06 15:35:52 UTC
spark 1.6 Issue
Hi All,
I am running my app in IntelliJ IDEA (locally) with master set to local[*]. The code
worked fine with Spark 1.5, but after upgrading to 1.6 I am getting the issue below.
Is this a bug in 1.6? When I change back to 1.5 it works without any error.
Do I need to pass executor memory when running locally in Spark 1.6?
Exception in thread "main" java.lang.IllegalArgumentException: System memory
259522560 must be at least 4.718592E8. Please use a larger heap size.
Thanks
Sri
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-6-Issue-tp25893.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org
Re: spark 1.6 Issue
Posted by Sri <ka...@gmail.com>.
Hi Mark,
I changed the VM options in the Edit Configurations section for both the main method and the Scala test class in IntelliJ, which worked fine when I ran them individually, but the test case fails while running maven install to build the jar file.
Can I set the VM options in the SparkConf in the Scala test class, in a hard-coded way?
Thanks
Sri
Sent from my iPhone
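[Editor's note: forked test JVMs started by a Maven build do not inherit IntelliJ's run-configuration VM options, which would explain the failure during maven install. One sketch of a fix, assuming the tests run through the Surefire plugin (adjust for scalatest-maven-plugin if that is what the build uses):]

```xml
<!-- Sketch: raise the forked test JVM's heap so Spark 1.6's
     minimum-memory check passes during `mvn install`
     (assuming tests are forked by the Surefire plugin). -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <argLine>-Xms128m -Xmx512m -XX:MaxPermSize=300m</argLine>
  </configuration>
</plugin>
```

[As for hard-coding it in the test class: the 1.6 UnifiedMemoryManager source reads an override property, so something like `conf.set("spark.testing.memory", "536870912")` on the SparkConf should also satisfy the check; the property name is taken from that source and is intended for testing only.]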
> On 6 Jan 2016, at 17:43, Mark Hamstra <ma...@clearstorydata.com> wrote:
>
> It's not a bug, but a larger heap is required with the new UnifiedMemoryManager: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/memory/UnifiedMemoryManager.scala#L172
>
>> On Wed, Jan 6, 2016 at 6:35 AM, Kali.tummala@gmail.com <Ka...@gmail.com> wrote:
>> [...]
Re: spark 1.6 Issue
Posted by Mark Hamstra <ma...@clearstorydata.com>.
It's not a bug, but a larger heap is required with the new
UnifiedMemoryManager:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/memory/UnifiedMemoryManager.scala#L172
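[Editor's note: the 4.718592E8 bytes in the error is exactly 1.5 x the 300 MB that the 1.6 memory manager reserves, per the source linked above. A small sketch of that arithmetic; the class and method names here are invented for illustration, not Spark's API:]

```java
// Sketch of the minimum-heap check introduced by Spark 1.6's
// UnifiedMemoryManager (constants taken from the linked source).
public class MinSystemMemory {
    // Spark 1.6 reserves 300 MB for internal bookkeeping.
    static final long RESERVED_SYSTEM_MEMORY_BYTES = 300L * 1024 * 1024;

    // The JVM heap must be at least 1.5x the reserved amount.
    static long minSystemMemory() {
        return (long) (RESERVED_SYSTEM_MEMORY_BYTES * 1.5);
    }

    public static void main(String[] args) {
        // Prints 471859200, i.e. the 4.718592E8 from the error message;
        // the reported heap of 259522560 bytes (~247 MB) falls short.
        System.out.println(minSystemMemory());
    }
}
```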
On Wed, Jan 6, 2016 at 6:35 AM, Kali.tummala@gmail.com <
Kali.tummala@gmail.com> wrote:
> [...]
Re: spark 1.6 Issue
Posted by "Kali.tummala@gmail.com" <Ka...@gmail.com>.
Hi All,
It worked OK after adding the VM options below.
-Xms128m -Xmx512m -XX:MaxPermSize=300m -ea
Thanks
Sri
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-6-Issue-tp25893p25920.html