Posted to user@spark.apache.org by kant kodali <ka...@gmail.com> on 2016/10/28 19:47:20 UTC

java.lang.OutOfMemoryError: unable to create new native thread

 "dag-scheduler-event-loop" java.lang.OutOfMemoryError: unable to create
new native thread
        at java.lang.Thread.start0(Native Method)
        at java.lang.Thread.start(Thread.java:714)
        at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
        at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.push(ForkJoinPool.java:1072)
        at scala.concurrent.forkjoin.ForkJoinTask.fork(ForkJoinTask.java:654)
        at scala.collection.parallel.ForkJoinTasks$WrappedTask$

This is the error produced by the Spark driver program, which runs in
client mode by default, so some people suggest simply increasing the heap
size by passing the --driver-memory 3g flag. However, the message *"unable
to create new native thread"* really means that the JVM asked the OS to
create a new thread and the OS could no longer allocate one. The number of
threads a JVM can create by requesting the OS is platform dependent, but it
is typically around 32K threads on a 64-bit JVM. So I am wondering why
Spark is even creating so many threads, and how do I control this number?
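
For anyone else debugging this: on Linux, this particular OOM usually means
an OS-level cap was hit rather than the Java heap. A quick diagnostic
sketch of the limits that typically bound native thread creation (the
paths are Linux-specific; adjust elsewhere):

```shell
# OS-level limits that bound native thread creation on Linux.
ulimit -u                          # max user processes; each thread counts against it
cat /proc/sys/kernel/pid_max       # system-wide PID cap; every thread consumes a PID
cat /proc/sys/kernel/threads-max   # kernel cap on total threads across all processes
```

If any of these is small relative to the thread count at crash time,
raising the limit (or finding the thread leak) is the fix, not
--driver-memory.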

Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by kant kodali <ka...@gmail.com>.
Another thing I forgot to mention is that it happens only after running for
several hours (say 4 to 5). I am not sure why it is creating so many
threads; is there any way to control them?
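
One way to confirm the count really climbs over those hours is to sample it
periodically. A minimal sketch, using the current shell's PID as a stand-in
for the Spark driver's PID:

```shell
# Sample a process's thread count (NLWP column) to watch for growth over time.
pid=$$   # stand-in; substitute the Spark driver's PID
for i in 1 2 3; do
  echo "$(date +%T) threads=$(ps -o nlwp= -p "$pid" | tr -d ' ')"
  sleep 1
done
```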


Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by kant kodali <ka...@gmail.com>.
Here is a UI rendering of my thread dump:

http://fastthread.io/my-thread-report.jsp?p=c2hhcmVkLzIwMTYvMTEvMS8tLWpzdGFja19kdW1wX3dpbmRvd19pbnRlcnZhbF8xbWluX2JhdGNoX2ludGVydmFsXzFzLnR4dC0tNi0xNy00Ng==
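
The same kind of tally can be done locally by grouping thread names from a
jstack dump and stripping per-thread numeric suffixes. A sketch, run here
against a tiny inline sample that stands in for real output (in practice
`dump.txt` would come from `jstack <driver-pid> > dump.txt`):

```shell
# Group threads in a jstack dump by name, ignoring per-thread numeric suffixes.
# The inline sample below stands in for real jstack output.
cat > dump.txt <<'EOF'
"dag-scheduler-event-loop" #41 daemon prio=5
"ForkJoinPool-1-worker-3" #42 daemon prio=5
"ForkJoinPool-1-worker-7" #43 daemon prio=5
EOF
grep '^"' dump.txt \
  | sed 's/^"\([^"]*\)".*/\1/' \
  | sed 's/-[0-9][0-9]*$//' \
  | sort | uniq -c | sort -rn
```

Whichever pool dominates the tally is the one leaking threads.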



On Mon, Oct 31, 2016 at 10:32 PM, kant kodali <ka...@gmail.com> wrote:

> Hi Vadim,
>
> Thank you so much this was a very useful command. This conversation is
> going on here
>
> https://www.mail-archive.com/user@spark.apache.org/msg58656.html
>
> or you can just google "
>
> why spark driver program is creating so many threads? How can I limit this
> number?
> <https://www.mail-archive.com/search?l=user@spark.apache.org&q=subject:%22Re%5C%3A+why+spark+driver+program+is+creating+so+many+threads%5C%3F+How+can+I+limit+this+number%5C%3F%22&o=newest>
> "
>
> please take a look if you are interested.
>
> Thanks a lot!
>
> On Mon, Oct 31, 2016 at 8:14 AM, Vadim Semenov <
> vadim.semenov@datadoghq.com> wrote:
>
>> Have you tried to get number of threads in a running process using `cat
>> /proc/<pid>/status` ?
>>
>> On Sun, Oct 30, 2016 at 11:04 PM, kant kodali <ka...@gmail.com> wrote:
>>
>>> yes I did run ps -ef | grep "app_name" and it is root.
>>>
>>>
>>>
>>> On Sun, Oct 30, 2016 at 8:00 PM, Chan Chor Pang <ch...@indetail.co.jp>
>>> wrote:
>>>
>>>> sorry, the UID
>>>>
>>>> On 10/31/16 11:59 AM, Chan Chor Pang wrote:
>>>>
>>>> Actually, if the max user processes limit is not the problem, I have no
>>>> idea.
>>>>
>>>> But I still suspect the user, since the user who runs spark-submit is
>>>> not necessarily the owner of the JVM process.
>>>>
>>>> Can you make sure that when you run "ps -ef | grep {your app id}" the PID is root?
>>>> On 10/31/16 11:21 AM, kant kodali wrote:
>>>>
>>>> The Java process is run by root, and it has the same config:
>>>>
>>>> sudo -i
>>>>
>>>> ulimit -a
>>>>
>>>> core file size          (blocks, -c) 0
>>>> data seg size           (kbytes, -d) unlimited
>>>> scheduling priority             (-e) 0
>>>> file size               (blocks, -f) unlimited
>>>> pending signals                 (-i) 120242
>>>> max locked memory       (kbytes, -l) 64
>>>> max memory size         (kbytes, -m) unlimited
>>>> open files                      (-n) 1024
>>>> pipe size            (512 bytes, -p) 8
>>>> POSIX message queues     (bytes, -q) 819200
>>>> real-time priority              (-r) 0
>>>> stack size              (kbytes, -s) 8192
>>>> cpu time               (seconds, -t) unlimited
>>>> max user processes              (-u) 120242
>>>> virtual memory          (kbytes, -v) unlimited
>>>> file locks                      (-x) unlimited
>>>>
>>>>
>>>>
>>>> On Sun, Oct 30, 2016 at 7:01 PM, Chan Chor Pang <chin-sh@indetail.co.jp
>>>> > wrote:
>>>>
>>>>> I have had the same exception before, and the problem was fixed after
>>>>> I changed the nproc conf.
>>>>>
>>>>> > max user processes              (-u) 120242
>>>>> ↑ this config does look good.
>>>>> Are you sure the user who ran ulimit -a is the same user who runs the
>>>>> Java process?
>>>>> Depending on how you submit the job and on your settings, the Spark
>>>>> job may execute as another user.
>>>>>
>>>>>
>>>>> On 10/31/16 10:38 AM, kant kodali wrote:
>>>>>
>>>>> when I did this
>>>>>
>>>>> cat /proc/sys/kernel/pid_max
>>>>>
>>>>> I got 32768
>>>>>
>>>>> On Sun, Oct 30, 2016 at 6:36 PM, kant kodali <ka...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> I believe for Ubuntu it is unlimited, but I am not 100% sure (I just
>>>>>> read that somewhere online). I ran ulimit -a and this is what I get:
>>>>>>
>>>>>> core file size          (blocks, -c) 0
>>>>>> data seg size           (kbytes, -d) unlimited
>>>>>> scheduling priority             (-e) 0
>>>>>> file size               (blocks, -f) unlimited
>>>>>> pending signals                 (-i) 120242
>>>>>> max locked memory       (kbytes, -l) 64
>>>>>> max memory size         (kbytes, -m) unlimited
>>>>>> open files                      (-n) 1024
>>>>>> pipe size            (512 bytes, -p) 8
>>>>>> POSIX message queues     (bytes, -q) 819200
>>>>>> real-time priority              (-r) 0
>>>>>> stack size              (kbytes, -s) 8192
>>>>>> cpu time               (seconds, -t) unlimited
>>>>>> max user processes              (-u) 120242
>>>>>> virtual memory          (kbytes, -v) unlimited
>>>>>> file locks                      (-x) unlimited
>>>>>>
>>>>>> On Sun, Oct 30, 2016 at 6:15 PM, Chan Chor Pang <
>>>>>> chin-sh@indetail.co.jp> wrote:
>>>>>>
>>>>>>> Not sure for Ubuntu, but I think you can just create the file
>>>>>>> yourself; the syntax is the same as /etc/security/limits.conf.
>>>>>>>
>>>>>>> nproc.conf limits not only the Java process but all processes owned
>>>>>>> by the same user, so even if the JVM process does nothing, if the
>>>>>>> corresponding user is busy in some other way the JVM process will
>>>>>>> still not be able to create a new thread.
>>>>>>>
>>>>>>> By the way, the default limit for CentOS is 1024.
>>>>>>>
>>>>>>> On 10/31/16 9:51 AM, kant kodali wrote:
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Oct 30, 2016 at 5:22 PM, Chan Chor Pang <
>>>>>>> chin-sh@indetail.co.jp> wrote:
>>>>>>>
>>>>>>>> /etc/security/limits.d/90-nproc.conf
>>>>>>>>
>>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> I am using Ubuntu 16.04 LTS. I have the directory
>>>>>>> /etc/security/limits.d/ but I don't have any files underneath it.
>>>>>>> This error happens after running for 4 to 5 hours. I wonder if this
>>>>>>> is a GC issue, and whether I should switch to CMS. I have also posted
>>>>>>> this on SO since I haven't gotten much of a response to this question:
>>>>>>> http://stackoverflow.com/questions/40315589/dag-scheduler-event-loop-java-lang-outofmemoryerror-unable-to-create-new-native
>>>>>>>
>>>>>>>
>>>>>>> Thanks,
>>>>>>> kant
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> ---*------------------------------------------------*---*---*---*---
>>>>>>> INDETAIL Inc.
>>>>>>> Nearshore General Services Division
>>>>>>> Game Services Department
>>>>>>> Chan Chor Pang (陳 楚鵬)
>>>>>>> E-mail: chin-sh@indetail.co.jp
>>>>>>> URL: http://www.indetail.co.jp
>>>>>>>
>>>>>>> [Sapporo Head Office / LABO / LABO2]
>>>>>>> 060-0042
>>>>>>> Kitako Center Building, Odori Nishi 9-chome 3-33, Chuo-ku, Sapporo
>>>>>>> (Head Office / LABO2: 2F; LABO: 9F)
>>>>>>> TEL: 011-206-9235 FAX: 011-206-9236
>>>>>>>
>>>>>>> [Tokyo Branch]
>>>>>>> 108-0014
>>>>>>> Cross Office Mita, Shiba 5-chome 29-20, Minato-ku, Tokyo
>>>>>>> TEL: 03-6809-6502 FAX: 03-6809-6504
>>>>>>>
>>>>>>> [Nagoya Satellite]
>>>>>>> 460-0002
>>>>>>> NAYUTA BLD, Marunouchi 3-chome 17-24, Naka-ku, Nagoya, Aichi
>>>>>>> TEL: 052-971-0086
>>>>>>>
>>>>>>>
>>>
>>
>
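
For reference, the 90-nproc.conf file mentioned in the quoted thread above
is a plain limits.conf-style fragment. An illustrative sketch (the values
here are hypothetical examples, not recommendations):

```
# /etc/security/limits.d/90-nproc.conf  (illustrative values only)
# <domain>   <type>   <item>    <value>
*            soft     nproc     4096
root         soft     nproc     unlimited
```

The soft nproc value caps processes and threads per user, applied by PAM at
login, which is why it can bite the JVM even when the heap is fine.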

Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by kant kodali <ka...@gmail.com>.
Hi Vadim,

Thank you so much; this was a very useful command. This conversation is
going on here:

https://www.mail-archive.com/user@spark.apache.org/msg58656.html

or you can just google "

why spark driver program is creating so many threads? How can I limit this
number?
<https://www.mail-archive.com/search?l=user@spark.apache.org&q=subject:%22Re%5C%3A+why+spark+driver+program+is+creating+so+many+threads%5C%3F+How+can+I+limit+this+number%5C%3F%22&o=newest>
"

please take a look if you are interested.

Thanks a lot!


Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by Vadim Semenov <va...@datadoghq.com>.
Have you tried to get the number of threads in the running process using
`cat /proc/<pid>/status`?
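
Concretely, the Threads line of that file gives the live count. A quick
sketch, using the current shell's PID as a stand-in for the driver's:

```shell
# Read the live thread count of a process from /proc (Linux only).
pid=$$   # stand-in; substitute the Spark driver's PID
grep '^Threads:' "/proc/$pid/status"
```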


Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by kant kodali <ka...@gmail.com>.
Yes, I did run ps -ef | grep "app_name" and it is root.




Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by Chan Chor Pang <ch...@indetail.co.jp>.
sorry, the UID



Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by Chan Chor Pang <ch...@indetail.co.jp>.
Actually, if the max user processes limit is not the problem, I have no idea.

But I still suspect the user: the user who runs spark-submit is not
necessarily the owner of the JVM process.

Can you make sure that when you run "ps -ef | grep {your app id}" the PID
is owned by root?
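One way to check this, as a rough sketch: inspect the owner and the native thread count (NLWP) of a PID. The snippet below uses the current shell's PID ($$) as a stand-in; substitute the driver JVM's PID found via ps -ef.

```shell
#!/bin/sh
# Sketch: show which user owns a process and how many native threads
# (NLWP, "number of lightweight processes") it currently has.
# $$ (this shell) is only a stand-in; use the Spark driver's PID instead.
pid=$$
owner=$(ps -o user= -p "$pid" | tr -d ' ')
threads=$(ps -o nlwp= -p "$pid" | tr -d ' ')
echo "pid=$pid owner=$owner threads=$threads"
```

If the owner is not the user whose ulimit -a you inspected, the effective nproc limit may be different from the one you checked.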


Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by kant kodali <ka...@gmail.com>.
The Java process is run by root, and it has the same config:

sudo -i

ulimit -a

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 120242
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 120242
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
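Since the nproc limit counts threads (each Linux thread is a lightweight process), it can help to compare the user's live thread total against `ulimit -u`. A rough sketch, assuming Linux procps, run as the same user that owns the driver JVM:

```shell
#!/bin/sh
# Sketch: count all lightweight processes (threads) belonging to the
# current user and compare against the nproc limit from `ulimit -u`.
uid=$(id -u)
used=$(ps -eLo uid= | awk -v u="$uid" '$1 == u' | wc -l)
limit=$(ulimit -u)
echo "uid=$uid threads_in_use=$used nproc_limit=$limit"
```

If threads_in_use climbs steadily toward the limit over those 4 to 5 hours, the symptom points to a thread leak rather than a limit that is simply too low.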




Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by Chan Chor Pang <ch...@indetail.co.jp>.
I had the same exception before, and the problem was fixed after I changed
the nproc conf.

 > max user processes              (-u) 120242
↑ this config does look good.
Are you sure the user who ran ulimit -a is the same user who runs the Java
process?
Depending on how you submit the job and your settings, the Spark job may
execute as another user.



Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by kant kodali <ka...@gmail.com>.
When I did this:

cat /proc/sys/kernel/pid_max

I got 32768
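On Linux, every thread occupies a PID, so kernel.pid_max (32768 here) is itself a ceiling on total threads system-wide, alongside kernel.threads-max. A sketch for inspecting, and as root tentatively raising, these limits, assuming a Linux /proc filesystem (the value 65536 is just an illustrative choice):

```shell
#!/bin/sh
# Sketch: inspect the kernel-wide limits that cap thread creation.
# Each Linux thread consumes a PID, so pid_max bounds total threads.
cat /proc/sys/kernel/pid_max       # 32768 by default on many distros
cat /proc/sys/kernel/threads-max   # absolute kernel thread ceiling
# As root, pid_max can be raised temporarily, e.g.:
#   sysctl -w kernel.pid_max=65536
# (persist via /etc/sysctl.conf; verify the effect on your distro)
```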


Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by kant kodali <ka...@gmail.com>.
I believe for Ubuntu it is unlimited, but I am not 100% sure (I just read
that somewhere online). I ran ulimit -a and this is what I get:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 120242
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 120242
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited


Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by Chan Chor Pang <ch...@indetail.co.jp>.
Not sure for Ubuntu, but I think you can just create the file yourself;
the syntax is the same as /etc/security/limits.conf.

nproc.conf limits not only the Java process but all processes owned by
the same user, so even if the JVM process does nothing, if the
corresponding user is busy in some other way the JVM process will still
be unable to create a new thread.

By the way, the default limit for CentOS is 1024.
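A sketch of what such a file could contain, using limits.conf syntax. The username "sparkuser" and the value 32768 are placeholders; the content is written to a temp file here, whereas on a real host it would go in /etc/security/limits.d/90-nproc.conf (root required, and the user must log in again for it to apply):

```shell
#!/bin/sh
# Sketch: a per-user nproc override in limits.conf syntax.
# "sparkuser" and 32768 are placeholder values; on a real host this
# content would live in /etc/security/limits.d/90-nproc.conf.
conf=$(mktemp)
cat > "$conf" <<'EOF'
# <domain>   <type>   <item>   <value>
sparkuser    soft     nproc    32768
sparkuser    hard     nproc    32768
EOF
cat "$conf"
rm -f "$conf"
```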


-- 
---*------------------------------------------------*---*---*---*---
INDETAIL Inc.
Nearshore General Services Division
Game Services Department
Chan Chor Pang
E-mail: chin-sh@indetail.co.jp
URL: http://www.indetail.co.jp

[Sapporo Head Office / LABO / LABO2]
060-0042
9-3-33 Odori Nishi, Chuo-ku, Sapporo
Kitako Center Building
(Head Office / LABO2: 2F, LABO: 9F)
TEL: 011-206-9235 FAX: 011-206-9236

[Tokyo Branch]
108-0014
Cross Office Mita, 5-29-20 Shiba, Minato-ku, Tokyo
TEL: 03-6809-6502 FAX: 03-6809-6504

[Nagoya Satellite]
460-0002
NAYUTA BLD, 3-17-24 Marunouchi, Naka-ku, Nagoya, Aichi
TEL: 052-971-0086


Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by kant kodali <ka...@gmail.com>.
On Sun, Oct 30, 2016 at 5:22 PM, Chan Chor Pang <ch...@indetail.co.jp>
wrote:

> /etc/security/limits.d/90-nproc.conf
>

Hi,

I am using Ubuntu 16.04 LTS. I have the directory /etc/security/limits.d/,
but I don't have any files underneath it. This error happens after running
for 4 to 5 hours. I wonder if this is a GC issue, and I am considering
whether I should use CMS. I have also posted this on SO, since I haven't
gotten much response to this question:
http://stackoverflow.com/questions/40315589/dag-scheduler-event-loop-java-lang-outofmemoryerror-unable-to-create-new-native


Thanks,
kant

Re: java.lang.OutOfMemoryError: unable to create new native thread

Posted by Chan Chor Pang <ch...@indetail.co.jp>.
You may want to check the process limit of the user who is responsible for
starting the JVM:
/etc/security/limits.d/90-nproc.conf

