Posted to dev@apex.apache.org by "Kottapalli, Venkatesh" <VK...@DIRECTV.com> on 2016/03/07 09:25:07 UTC

Reg. files handled soft limit set in the application

Hi,

                Is there a limit set by the DT application by default on the number of files the application works on? If so, is there a way to increase that soft limit?

-Venkatesh.


Re: Reg. files handled soft limit set in the application

Posted by Priyanka Gugale <pr...@datatorrent.com>.
Hi Kottapalli,

There is no such limit imposed by the platform, but operators often set
limits of their own to deal with memory usage, slower downstream operators,
bandwidth constraints, and so on. If you can elaborate on which operator you
are using, we may be able to help you locate the right properties.

-Priyanka

On Mon, Mar 7, 2016 at 1:55 PM, Kottapalli, Venkatesh <
VKottapalli@directv.com> wrote:

> Hi,
>
>                 Is there a limit set by DT application by default on the
> number of files the application is working on? If so, is there a way to
> increase the soft limit set?
>
> -Venkatesh.
>
>

RE: Reg. files handled soft limit set in the application

Posted by "Kottapalli, Venkatesh" <VK...@DIRECTV.com>.
Thank You Sandeep. 

-Venkatesh.

-----Original Message-----
From: Sandeep Deshmukh [mailto:sandeep@datatorrent.com] 
Sent: Wednesday, March 16, 2016 11:39 PM
To: dev
Subject: Re: Reg. files handled soft limit set in the application

There is no such limit from Apex itself, but whatever applies to Hadoop still applies here, so you may need to tweak some settings on the Hadoop/OS side.

https://wiki.apache.org/hadoop/TooManyOpenFiles

http://askubuntu.com/questions/162345/how-to-increase-open-file-limits-nofile-and-epoll-in-10-04

Regards,
Sandeep

On Thu, Mar 17, 2016 at 11:24 AM, Kottapalli, Venkatesh < VKottapalli@directv.com> wrote:

> Hi,
>
> The App Master fails with the exception below. The upper limit on the
> system is 1024. Could you please suggest what the possible cause might
> be? I do not see any container failures in the application and am not
> sure why it is opening so many files.
>
> 2016-03-16 23:34:46,269 [943875111@qtp-1149942716-31] FATAL 
> conf.Configuration loadResource - error parsing conf core-site.xml
> java.io.FileNotFoundException:
> /var/run/cloudera-scm-agent/process/3149-yarn-NODEMANAGER/core-site.xml
> (Too many open files)
>         at java.io.FileInputStream.open(Native Method)
>         at java.io.FileInputStream.<init>(FileInputStream.java:146)
>         at java.io.FileInputStream.<init>(FileInputStream.java:101)
>         at
> sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
>         at
> sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
>         at java.net.URL.openStream(URL.java:1037)
>         at
> org.apache.hadoop.conf.Configuration.parse(Configuration.java:2378)
>         at
> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2449)
>         at
> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2402)
>         at
> org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2319)
>         at
> org.apache.hadoop.conf.Configuration.get(Configuration.java:1146)
>         at
> com.datatorrent.stram.util.ConfigUtils.getSchemePrefix(ConfigUtils.java:73)
>         at
> com.datatorrent.stram.StreamingContainerManager.getAppMasterContainerInfo(StreamingContainerManager.java:418)
>         at
> com.datatorrent.stram.webapp.StramWebServices.listContainers(StramWebServices.java:442)
>         at sun.reflect.GeneratedMethodAccessor95.invoke(Unknown Source)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at
> com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
>
> -Venkatesh.
>
>
>
>
> -----Original Message-----
> From: Tushar Gosavi [mailto:tushar@datatorrent.com]
> Sent: Monday, March 07, 2016 8:18 AM
> To: dev@apex.incubator.apache.org
> Subject: Re: Reg. files handled soft limit set in the application
>
> When running an application with lots of physical operators, the
> Application Master went down because it could not open any new
> connections due to the limit on file handles. Do we open a connection per
> container or per operator partition? There were also some failures;
> connection close may not be happening when containers fail (the RPC
> timeout is set to a high value). The user has the soft limit set to 1024
> and the hard limit set to 4096, which is also low.
>
> Is there any way (an Apex configuration property) to run containers with
> an increased soft limit, or does the user need to change the system
> configuration to allow more open files per process?
>
> - Tushar.
>
>
> On Mon, Mar 7, 2016 at 9:17 PM, Munagala Ramanath 
> <ra...@datatorrent.com>
> wrote:
>
> > *sysctl fs.file-max*
> > should show you the kernel limit.
> >
> > *ulimit -n*
> > shows the per-user limit
> >
> > You can see the list of open files used by a process with (where 
> > <pid> is the process id):
> > *ls -l /proc/<pid>/fd*
> >
> > You can also use the *lsof* command described here:
> > http://www.thegeekstuff.com/2012/08/lsof-command-examples/
> >
> > Are you running into the limit ? Can you share some details of the 
> > error you're seeing ?
> >
> > Ram
> >
> > On Mon, Mar 7, 2016 at 12:25 AM, Kottapalli, Venkatesh < 
> > VKottapalli@directv.com> wrote:
> >
> > > Hi,
> > >
> > >                 Is there a limit set by DT application by default 
> > > on the number of files the application is working on? If so, is 
> > > there a way to increase the soft limit set?
> > >
> > > -Venkatesh.
> > >
> > >
> >
>

Re: Reg. files handled soft limit set in the application

Posted by Sandeep Deshmukh <sa...@datatorrent.com>.
There is no such limit from Apex itself, but whatever applies to Hadoop still
applies here, so you may need to tweak some settings on the Hadoop/OS side.

https://wiki.apache.org/hadoop/TooManyOpenFiles

http://askubuntu.com/questions/162345/how-to-increase-open-file-limits-nofile-and-epoll-in-10-04
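
As a rough illustration of the kind of change those pages describe (a sketch
only; the 'yarn' account and the numbers are placeholders, and new limits only
apply to processes started after a fresh login or NodeManager restart):

# append nofile limits for the account that launches the containers:
printf 'yarn soft nofile 32768\nyarn hard nofile 65536\n' | sudo tee -a /etc/security/limits.conf

# then, from a fresh login shell for that account, verify:
ulimit -Sn    # soft limit on open files
ulimit -Hn    # hard limit on open files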

Regards,
Sandeep

On Thu, Mar 17, 2016 at 11:24 AM, Kottapalli, Venkatesh <
VKottapalli@directv.com> wrote:

> Hi,
>
> The App Master fails with the exception below. The upper limit on the
> system is 1024. Could you please suggest what the possible cause might be?
> I do not see any container failures in the application and am not sure why
> it is opening so many files.
>
> 2016-03-16 23:34:46,269 [943875111@qtp-1149942716-31] FATAL
> conf.Configuration loadResource - error parsing conf core-site.xml
> java.io.FileNotFoundException:
> /var/run/cloudera-scm-agent/process/3149-yarn-NODEMANAGER/core-site.xml
> (Too many open files)
>         at java.io.FileInputStream.open(Native Method)
>         at java.io.FileInputStream.<init>(FileInputStream.java:146)
>         at java.io.FileInputStream.<init>(FileInputStream.java:101)
>         at
> sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
>         at
> sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
>         at java.net.URL.openStream(URL.java:1037)
>         at
> org.apache.hadoop.conf.Configuration.parse(Configuration.java:2378)
>         at
> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2449)
>         at
> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2402)
>         at
> org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2319)
>         at
> org.apache.hadoop.conf.Configuration.get(Configuration.java:1146)
>         at
> com.datatorrent.stram.util.ConfigUtils.getSchemePrefix(ConfigUtils.java:73)
>         at
> com.datatorrent.stram.StreamingContainerManager.getAppMasterContainerInfo(StreamingContainerManager.java:418)
>         at
> com.datatorrent.stram.webapp.StramWebServices.listContainers(StramWebServices.java:442)
>         at sun.reflect.GeneratedMethodAccessor95.invoke(Unknown Source)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at
> com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
>
> -Venkatesh.
>
>
>
>
> -----Original Message-----
> From: Tushar Gosavi [mailto:tushar@datatorrent.com]
> Sent: Monday, March 07, 2016 8:18 AM
> To: dev@apex.incubator.apache.org
> Subject: Re: Reg. files handled soft limit set in the application
>
> When running an application with lots of physical operators, the
> Application Master went down because it could not open any new
> connections due to the limit on file handles. Do we open a connection per
> container or per operator partition? There were also some failures;
> connection close may not be happening when containers fail (the RPC
> timeout is set to a high value). The user has the soft limit set to 1024
> and the hard limit set to 4096, which is also low.
>
> Is there any way (an Apex configuration property) to run containers with
> an increased soft limit, or does the user need to change the system
> configuration to allow more open files per process?
>
> - Tushar.
>
>
> On Mon, Mar 7, 2016 at 9:17 PM, Munagala Ramanath <ra...@datatorrent.com>
> wrote:
>
> > *sysctl fs.file-max*
> > should show you the kernel limit.
> >
> > *ulimit -n*
> > shows the per-user limit
> >
> > You can see the list of open files used by a process with (where <pid>
> > is the process id):
> > *ls -l /proc/<pid>/fd*
> >
> > You can also use the *lsof* command described here:
> > http://www.thegeekstuff.com/2012/08/lsof-command-examples/
> >
> > Are you running into the limit ? Can you share some details of the
> > error you're seeing ?
> >
> > Ram
> >
> > On Mon, Mar 7, 2016 at 12:25 AM, Kottapalli, Venkatesh <
> > VKottapalli@directv.com> wrote:
> >
> > > Hi,
> > >
> > >                 Is there a limit set by DT application by default on
> > > the number of files the application is working on? If so, is there a
> > > way to increase the soft limit set?
> > >
> > > -Venkatesh.
> > >
> > >
> >
>

RE: Reg. files handled soft limit set in the application

Posted by "Kottapalli, Venkatesh" <VK...@DIRECTV.com>.
Hi,

The App Master fails with the exception below. The upper limit on the system is 1024. Could you please suggest what the possible cause might be? I do not see any container failures in the application and am not sure why it is opening so many files.

2016-03-16 23:34:46,269 [943875111@qtp-1149942716-31] FATAL conf.Configuration loadResource - error parsing conf core-site.xml
java.io.FileNotFoundException: /var/run/cloudera-scm-agent/process/3149-yarn-NODEMANAGER/core-site.xml (Too many open files)
	at java.io.FileInputStream.open(Native Method)
	at java.io.FileInputStream.<init>(FileInputStream.java:146)
	at java.io.FileInputStream.<init>(FileInputStream.java:101)
	at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
	at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
	at java.net.URL.openStream(URL.java:1037)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2378)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2449)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2402)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2319)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1146)
	at com.datatorrent.stram.util.ConfigUtils.getSchemePrefix(ConfigUtils.java:73)
	at com.datatorrent.stram.StreamingContainerManager.getAppMasterContainerInfo(StreamingContainerManager.java:418)
	at com.datatorrent.stram.webapp.StramWebServices.listContainers(StramWebServices.java:442)
	at sun.reflect.GeneratedMethodAccessor95.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
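
To narrow down what the AM is actually holding open when this happens (sockets
to containers versus configuration files like the one above), one option,
assuming you can reach the node running the AM and know its pid, is to look at
its descriptor table directly:

ls -l /proc/<AM_PID>/fd                      # what each open descriptor points at
ls -l /proc/<AM_PID>/fd | grep -c 'socket:'  # how many of them are sockets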

-Venkatesh.




-----Original Message-----
From: Tushar Gosavi [mailto:tushar@datatorrent.com] 
Sent: Monday, March 07, 2016 8:18 AM
To: dev@apex.incubator.apache.org
Subject: Re: Reg. files handled soft limit set in the application

When running an application with lots of physical operators, the Application Master went down because it could not open any new connections due to the limit on file handles. Do we open a connection per container or per operator partition? There were also some failures; connection close may not be happening when containers fail (the RPC timeout is set to a high value). The user has the soft limit set to 1024 and the hard limit set to 4096, which is also low.

Is there any way (an Apex configuration property) to run containers with an increased soft limit, or does the user need to change the system configuration to allow more open files per process?

- Tushar.


On Mon, Mar 7, 2016 at 9:17 PM, Munagala Ramanath <ra...@datatorrent.com>
wrote:

> *sysctl fs.file-max*
> should show you the kernel limit.
>
> *ulimit -n*
> shows the per-user limit
>
> You can see the list of open files used by a process with (where <pid> 
> is the process id):
> *ls -l /proc/<pid>/fd*
>
> You can also use the *lsof* command described here:
> http://www.thegeekstuff.com/2012/08/lsof-command-examples/
>
> Are you running into the limit ? Can you share some details of the 
> error you're seeing ?
>
> Ram
>
> On Mon, Mar 7, 2016 at 12:25 AM, Kottapalli, Venkatesh < 
> VKottapalli@directv.com> wrote:
>
> > Hi,
> >
> >                 Is there a limit set by DT application by default on 
> > the number of files the application is working on? If so, is there a 
> > way to increase the soft limit set?
> >
> > -Venkatesh.
> >
> >
>

Re: Reg. files handled soft limit set in the application

Posted by Tushar Gosavi <tu...@datatorrent.com>.
When running an application with lots of physical operators, the Application
Master went down because it could not open any new connections due to the
limit on file handles. Do we open a connection per container or per operator
partition? There were also some failures; connection close may not be
happening when containers fail (the RPC timeout is set to a high value). The
user has the soft limit set to 1024 and the hard limit set to 4096, which is
also low.

Is there any way (an Apex configuration property) to run containers with an
increased soft limit, or does the user need to change the system configuration
to allow more open files per process?
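
One quick check (a sketch; the pgrep pattern below is just a placeholder for
whatever identifies the AM JVM on that node) is to read the limits the running
Application Master actually inherited:

AM_PID=$(pgrep -f '<pattern matching the AM JVM>' | head -n 1)
grep 'Max open files' "/proc/$AM_PID/limits"

# The limits are inherited at container launch, so raising them durably means
# raising them for the account that starts the containers (e.g. via
# /etc/security/limits.conf) and restarting the NodeManager.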

- Tushar.


On Mon, Mar 7, 2016 at 9:17 PM, Munagala Ramanath <ra...@datatorrent.com>
wrote:

> *sysctl fs.file-max*
> should show you the kernel limit.
>
> *ulimit -n*
> shows the per-user limit
>
> You can see the list of open files used by a process with (where <pid> is
> the process id):
> *ls -l /proc/<pid>/fd*
>
> You can also use the *lsof* command described here:
> http://www.thegeekstuff.com/2012/08/lsof-command-examples/
>
> Are you running into the limit ? Can you share some details of the error
> you're seeing ?
>
> Ram
>
> On Mon, Mar 7, 2016 at 12:25 AM, Kottapalli, Venkatesh <
> VKottapalli@directv.com> wrote:
>
> > Hi,
> >
> >                 Is there a limit set by DT application by default on the
> > number of files the application is working on? If so, is there a way to
> > increase the soft limit set?
> >
> > -Venkatesh.
> >
> >
>

Re: Reg. files handled soft limit set in the application

Posted by Munagala Ramanath <ra...@datatorrent.com>.
*sysctl fs.file-max*
should show you the kernel limit.

*ulimit -n*
shows the per-user limit

You can see the list of open files used by a process with (where <pid> is
the process id):
*ls -l /proc/<pid>/fd*

You can also use the *lsof* command described here:
http://www.thegeekstuff.com/2012/08/lsof-command-examples/
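
For a quick count rather than a full listing (a sketch; substitute the actual
process id for <pid>, and note that lsof prints one header line):

ls /proc/<pid>/fd | wc -l     # number of descriptors open right now
lsof -p <pid> | wc -l         # similar count via lsof (includes the header line)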

Are you running into the limit? Can you share some details of the error
you're seeing?

Ram

On Mon, Mar 7, 2016 at 12:25 AM, Kottapalli, Venkatesh <
VKottapalli@directv.com> wrote:

> Hi,
>
>                 Is there a limit set by DT application by default on the
> number of files the application is working on? If so, is there a way to
> increase the soft limit set?
>
> -Venkatesh.
>
>