Posted to user@spark.apache.org by Ted Yu <yu...@gmail.com> on 2016/05/12 21:40:00 UTC

Re: How to get and save core dump of native library in executors

Which OS are you using?

See http://en.linuxreviews.org/HOWTO_enable_core-dumps
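
For reference, a rough sketch of what that HOWTO boils down to on a typical
Linux box (the dump location below is only an example):

    # allow core dumps for the current shell and anything it launches
    ulimit -c unlimited

    # make the limit persistent for all users, in /etc/security/limits.conf
    *    soft    core    unlimited

    # tell the kernel where to write core files (%e = executable, %p = pid)
    sudo sysctl -w kernel.core_pattern=/tmp/core_%e.%p

Since Spark executors run inside YARN containers, the raised limit generally
has to apply to the NodeManager process (or the user it runs as) so that the
containers it forks inherit it.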

On Thu, May 12, 2016 at 2:23 PM, prateek arora <pr...@gmail.com>
wrote:

> Hi
>
> I am running my Spark application with some third-party native libraries,
> but it sometimes crashes with the error "Failed to write core dump. Core
> dumps have been disabled. To enable core dumping, try "ulimit -c unlimited"
> before starting Java again".
>
> Below is the log:
>
>  A fatal error has been detected by the Java Runtime Environment:
> #
> #  SIGSEGV (0xb) at pc=0x00007fd44b491fb9, pid=20458, tid=140549318547200
> #
> # JRE version: Java(TM) SE Runtime Environment (7.0_67-b01) (build
> 1.7.0_67-b01)
> # Java VM: Java HotSpot(TM) 64-Bit Server VM (24.65-b04 mixed mode
> linux-amd64 compressed oops)
> # Problematic frame:
> # V  [libjvm.so+0x650fb9]  jni_SetByteArrayRegion+0xa9
> #
> # Failed to write core dump. Core dumps have been disabled. To enable core
> dumping, try "ulimit -c unlimited" before starting Java again
> #
> # An error report file with more information is saved as:
> #
>
> /yarn/nm/usercache/master/appcache/application_1462930975871_0004/container_1462930975871_0004_01_000066/hs_err_pid20458.log
> #
> # If you would like to submit a bug report, please visit:
> #   http://bugreport.sun.com/bugreport/crash.jsp
> #
>
>
> So how can I enable core dumps and save them somewhere?
>
> Regards
> Prateek
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-get-and-save-core-dump-of-native-library-in-executors-tp26945.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>

Re: How to get and save core dump of native library in executors

Posted by prateek arora <pr...@gmail.com>.
Ubuntu 14.04
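
On Ubuntu 14.04 the default kernel.core_pattern pipes crashes to apport, so a
core file may never appear in the container directory even after raising the
ulimit. A rough sketch of the node-level changes usually involved (the dump
directory and the yarn user name below are assumptions based on a typical
YARN layout):

    # see what currently handles core dumps (on 14.04 this is usually apport)
    cat /proc/sys/kernel/core_pattern

    # write plain core files to a world-writable directory instead
    sudo mkdir -p /tmp/cores && sudo chmod 1777 /tmp/cores
    sudo sysctl -w kernel.core_pattern=/tmp/cores/core_%e.%p

    # raise the core limit for the user running the NodeManager,
    # e.g. in /etc/security/limits.conf:
    yarn    soft    core    unlimited
    yarn    hard    core    unlimited

    # or export it in yarn-env.sh so launched containers inherit it
    ulimit -c unlimited

You can also keep the JVM crash report itself in a stable location, e.g. with
spark.executor.extraJavaOptions=-XX:ErrorFile=/tmp/hs_err_pid%p.log, so the
hs_err file is not lost when YARN cleans up the container directory.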

On Thu, May 12, 2016 at 2:40 PM, Ted Yu <yu...@gmail.com> wrote:

> Which OS are you using ?
>
> See http://en.linuxreviews.org/HOWTO_enable_core-dumps