Posted to user@predictionio.apache.org by "Guruju, Lakshmi Sravya" <la...@sap.com> on 2016/10/24 04:23:28 UTC

Stack overflow exception with Similar Product engine

Hi,

I am using the PredictionIO Similar Product engine. I was able to build the engine successfully on my Mac machine, and training also succeeded with a small data set of 50 user events, 50 item events, and 50 view events.

But when I import around 5000 user events, 5000 item events, and 5000 view events and try to train again, I get a StackOverflowError. Just FYI, I am using the Java SDK to import the data in batches of 50 events at a time. Part of the exception trace is below.

I also found something related to this on Stack Overflow (http://stackoverflow.com/questions/31484460/spark-gives-a-stackoverflowerror-when-training-using-als), but as I am new to Spark, I could not work out where to apply the fix.
Could somebody help me resolve it?
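
For reference, the fix discussed in that Stack Overflow question amounts to checkpointing during ALS training, so that Spark's RDD lineage never grows deep enough to overflow the stack when tasks are (de)serialized. A rough sketch of where such a change might go, assuming the Similar Product template's ALSAlgorithm.scala train() method (the checkpoint path, the interval, and the ap.* / mllibRatings / seed identifiers below follow the template's conventions but are illustrative, not taken from this thread):

// In ALSAlgorithm.train(), before training starts: give Spark a checkpoint
// directory so MLlib's ALS can periodically truncate the RDD lineage.
// MLlib only checkpoints when a checkpoint directory has been set.
sc.setCheckpointDir("/tmp/pio-checkpoint")  // illustrative path

// Optionally, use the ALS builder instead of the ALS.trainImplicit shortcut
// so the checkpoint interval can be set explicitly:
import org.apache.spark.mllib.recommendation.ALS

val m = new ALS()
  .setImplicitPrefs(true)
  .setRank(ap.rank)                  // algorithm parameters from the template
  .setIterations(ap.numIterations)
  .setLambda(ap.lambda)
  .setSeed(seed)
  .setCheckpointInterval(10)         // checkpoint every 10 ALS iterations
  .run(mllibRatings)                 // returns a MatrixFactorizationModel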

[ERROR] [Executor] Exception in task 1.0 in stage 692.0 (TID 540)
[ERROR] [Executor] Exception in task 0.0 in stage 692.0 (TID 539)
[ERROR] [Executor] Exception in task 3.0 in stage 692.0 (TID 542)
[WARN] [TaskSetManager] Lost task 0.0 in stage 692.0 (TID 539, localhost): java.lang.StackOverflowError
at java.io.ObjectInputStream$PeekInputStream.peek(ObjectInputStream.java:2296)
at java.io.ObjectInputStream$BlockDataInputStream.peek(ObjectInputStream.java:2589)
at java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2599)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1506)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)

Regards,
Sravya

RE: Stack overflow exception with Similar Product engine

Posted by "Guruju, Lakshmi Sravya" <la...@sap.com>.
Hi Mars,

Thank you so much. It worked now.

Regards,
Sravya

From: Mars Hall [mailto:mars@heroku.com]
Sent: Tuesday, October 25, 2016 1:32 AM
To: user@predictionio.incubator.apache.org
Subject: Re: Stack overflow exception with Similar Product engine

Lakshmi,

Every time my team has hit a stack overflow with PIO/Spark, it has been resolved by allocating more memory. The right values depend on how much RAM your runtime has. Add options like the following to the pio commands:

pio train -- --driver-memory 1g --executor-memory 4g

Make sure the extra double dash sits between the pio options and the Spark options, as shown.

*Mars

Re: Stack overflow exception with Similar Product engine

Posted by Mars Hall <ma...@heroku.com>.
Lakshmi,

Every time my team has hit a stack overflow with PIO/Spark, it has been resolved by allocating more memory. The right values depend on how much RAM your runtime has. Add options like the following to the pio commands:

pio train -- --driver-memory 1g --executor-memory 4g

Make sure the extra double dash sits between the pio options and the Spark options, as shown.
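
For example, assuming pio forwards everything after the bare -- to spark-submit for its other commands as well, the same pattern can be scaled up or reused elsewhere (the figures here are only illustrative):

pio train -- --driver-memory 2g --executor-memory 8g
pio deploy -- --driver-memory 1g --executor-memory 4g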

*Mars
