Posted to user@spark.apache.org by "Zalzberg, Idan (Agoda)" <Id...@agoda.com> on 2015/03/01 16:03:20 UTC

unsafe memory access in spark 1.2.1

Hi,
I am using Spark 1.2.1 and sporadically get the following error:
Any thoughts on what could be the cause?

Thanks

2015-02-27 15:08:47 ERROR SparkUncaughtExceptionHandler:96 - Uncaught exception in thread Thread[Executor task launch worker-25,5,main]
java.lang.InternalError: a fault occurred in a recent unsafe memory access operation in compiled Java code
                    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1377)
                    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
                    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
                    at org.apache.spark.serializer.DeserializationStream$$anon$1.getNext(Serializer.scala:133)
                    at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:71)
                    at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:350)
                    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
                    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
                    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
                    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
                    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
                    at org.apache.spark.util.collection.ExternalSorter.spillToPartitionFiles(ExternalSorter.scala:365)
                    at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:211)
                    at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:63)
                    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
                    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
                    at org.apache.spark.scheduler.Task.run(Task.scala:56)
                    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:200)
                    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
                    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
                    at java.lang.Thread.run(Thread.java:745)


________________________________
This message is confidential and is for the sole use of the intended recipient(s). It may also be privileged or otherwise protected by copyright or other legal rules. If you have received it by mistake please let us know by reply email and delete it from your system. It is prohibited to copy this message or disclose its content to anyone. Any confidentiality or privilege is not waived or lost by any mistaken delivery or unauthorized disclosure of the message. All messages sent to and from Agoda may be monitored to ensure compliance with company policies, to protect the company's interests and to remove potential malware. Electronic messages may be intercepted, amended, lost or deleted, or contain viruses.

Re: unsafe memory access in spark 1.2.1

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Not sure, but it could be related to the Netty off-heap access described in
https://issues.apache.org/jira/browse/SPARK-4516, although the error message
there is different.
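If Netty's off-heap buffers are the suspect, one thing that could be tried (a sketch of a possible mitigation, not a confirmed fix for this error) is telling Spark's Netty-based shuffle transport to prefer heap buffers over direct buffers, via spark-defaults.conf:

```
# spark-defaults.conf
# (Netty only) Prefer on-heap buffers over off-heap (direct) buffers for
# shuffle I/O. Sketch of a possible mitigation, not a confirmed fix.
spark.shuffle.io.preferDirectBufs   false
```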

Thanks
Best Regards

On Mon, Mar 2, 2015 at 12:51 AM, Zalzberg, Idan (Agoda) <
Idan.Zalzberg@agoda.com> wrote:

>  Thanks,
>
> We monitor disk space so I doubt that is it, but I will check again

RE: unsafe memory access in spark 1.2.1

Posted by "Zalzberg, Idan (Agoda)" <Id...@agoda.com>.
Thanks,
We monitor disk space, so I doubt that is it, but I will check again.
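For reference, this is roughly the kind of check involved; the path below is an assumption (spark.local.dir defaults to /tmp), so substitute whatever your deployment configures:

```shell
# Report free space on the directory Spark uses for shuffle spill files.
# /tmp is only the default for spark.local.dir; adjust to your configured path.
df -h /tmp
```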


From: Ted Yu [mailto:yuzhihong@gmail.com]
Sent: Sunday, March 01, 2015 11:45 PM
To: Zalzberg, Idan (Agoda)
Cc: user@spark.apache.org
Subject: Re: unsafe memory access in spark 1.2.1

Google led me to:
https://bugs.openjdk.java.net/browse/JDK-8040802

Not sure if the last comment there applies to your deployment.

On Sun, Mar 1, 2015 at 8:32 AM, Zalzberg, Idan (Agoda) <Id...@agoda.com>> wrote:
My run time version is:

java version "1.7.0_75"
OpenJDK Runtime Environment (rhel-2.5.4.0.el6_6-x86_64 u75-b13)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)

Thanks

From: Ted Yu [mailto:yuzhihong@gmail.com<ma...@gmail.com>]
Sent: Sunday, March 01, 2015 10:18 PM
To: Zalzberg, Idan (Agoda)
Cc: user@spark.apache.org<ma...@spark.apache.org>
Subject: Re: unsafe memory access in spark 1.2.1

What Java version are you using ?

Thanks

On Sun, Mar 1, 2015 at 7:03 AM, Zalzberg, Idan (Agoda) <Id...@agoda.com>> wrote:
Hi,
I am using spark 1.2.1, sometimes I get these errors sporadically:
Any thought on what could be the cause?

Thanks




Re: unsafe memory access in spark 1.2.1

Posted by Ted Yu <yu...@gmail.com>.
Google led me to:
https://bugs.openjdk.java.net/browse/JDK-8040802

Not sure if the last comment there applies to your deployment.
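To gauge how often this is happening across executors, one rough approach is to scan executor logs for the error signature. A self-contained sketch (the /tmp/spark-logs directory and sample file are stand-ins created purely for illustration; point the grep at your real executor log directory):

```shell
# Create a stand-in log directory with a sample line so the sketch is
# self-contained; in practice, grep your real executor log directory.
mkdir -p /tmp/spark-logs
printf 'java.lang.InternalError: a fault occurred in a recent unsafe memory access operation\n' \
  > /tmp/spark-logs/executor.log

# List log files containing the InternalError signature.
grep -rl "unsafe memory access" /tmp/spark-logs
# prints /tmp/spark-logs/executor.log
```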

On Sun, Mar 1, 2015 at 8:32 AM, Zalzberg, Idan (Agoda) <
Idan.Zalzberg@agoda.com> wrote:

>  My run time version is:
>
>
>
> java version "1.7.0_75"
>
> OpenJDK Runtime Environment (rhel-2.5.4.0.el6_6-x86_64 u75-b13)
>
> OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
>
>
>
> Thanks
>
>
>
> *From:* Ted Yu [mailto:yuzhihong@gmail.com]
> *Sent:* Sunday, March 01, 2015 10:18 PM
> *To:* Zalzberg, Idan (Agoda)
> *Cc:* user@spark.apache.org
> *Subject:* Re: unsafe memory access in spark 1.2.1
>
>
>
> What Java version are you using ?
>
>
>
> Thanks
>
>
>
> On Sun, Mar 1, 2015 at 7:03 AM, Zalzberg, Idan (Agoda) <
> Idan.Zalzberg@agoda.com> wrote:
>
>  Hi,
>
> I am using spark 1.2.1, sometimes I get these errors sporadically:
>
> Any thought on what could be the cause?
>
> Thanks
>
>
>

RE: unsafe memory access in spark 1.2.1

Posted by "Zalzberg, Idan (Agoda)" <Id...@agoda.com>.
My runtime version is:

java version "1.7.0_75"
OpenJDK Runtime Environment (rhel-2.5.4.0.el6_6-x86_64 u75-b13)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)

Thanks
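As a small aside, the update number can be pulled out of a version string like the one above when comparing against the builds discussed in a JDK bug report (the threshold to compare against is whatever the report says, not something asserted here):

```shell
# Extract the update number (the digits after the underscore) from a
# "java version" banner line; "1.7.0_75" yields 75.
ver='java version "1.7.0_75"'
update=$(printf '%s' "$ver" | sed -n 's/.*_\([0-9][0-9]*\)".*/\1/p')
echo "$update"   # → 75
```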

From: Ted Yu [mailto:yuzhihong@gmail.com]
Sent: Sunday, March 01, 2015 10:18 PM
To: Zalzberg, Idan (Agoda)
Cc: user@spark.apache.org
Subject: Re: unsafe memory access in spark 1.2.1

What Java version are you using ?

Thanks

On Sun, Mar 1, 2015 at 7:03 AM, Zalzberg, Idan (Agoda) <Id...@agoda.com>> wrote:
Hi,
I am using spark 1.2.1, sometimes I get these errors sporadically:
Any thought on what could be the cause?

Thanks



Re: unsafe memory access in spark 1.2.1

Posted by Ted Yu <yu...@gmail.com>.
What Java version are you using?

Thanks

On Sun, Mar 1, 2015 at 7:03 AM, Zalzberg, Idan (Agoda) <
Idan.Zalzberg@agoda.com> wrote:

>  Hi,
>
> I am using spark 1.2.1, sometimes I get these errors sporadically:
>
> Any thought on what could be the cause?
>
> Thanks
>
>
>