Posted to users@httpd.apache.org by Juan E Suris <ju...@rootn.com> on 2004/03/07 00:10:54 UTC

[users@httpd] httpd process using too much memory

Hi All!

I have searched through the documentation and can't really find a solution to my problem and I can't seem to find anybody that knows what to do.

My problem is that I have a perl cgi (on linux, apache 2.0) that generates very large files, which are sent to the browser directly from the perl script (using "Content-Disposition: attachment", etc.). This in turn causes the httpd process serving the request to use up a lot of memory (about the size of the file being served). Is there any way to limit the amount of memory the process will use? It seems like Apache does not block the output of the script to match the speed at which the client can receive it, and therefore has to store the output temporarily in memory (correct?).

A solution that has been proposed is to write the file to the filesystem and redirect the request to that file. I am really trying to avoid doing this, because the I/O generated by writing the file to disk and then reading it back will kill my app.
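To make the intended behavior concrete, here is a rough sketch of what the script does (in Python for illustration; my actual script is Perl, and `read_chunk` is a stand-in for whatever generates the file's bytes; the content type and filename are just examples):

```python
# Sketch: stream a large generated body to the client in fixed-size chunks,
# with Content-Length known up front, instead of building the whole body in
# memory. Each chunk is flushed immediately so the socket (not the server
# process) provides the backpressure.

CHUNK = 64 * 1024  # write in 64 KB pieces

def stream_response(out, body_size, read_chunk):
    """Write a CGI-style response to `out`, pulling data from `read_chunk`.

    `read_chunk(n)` is a hypothetical data source: it returns at most n
    bytes, or b'' when exhausted.
    """
    headers = (
        "Content-Type: image/jpeg\r\n"
        "Content-Disposition: attachment; filename=\"big.jpg\"\r\n"
        f"Content-Length: {body_size}\r\n"
        "\r\n"
    )
    out.write(headers.encode("ascii"))
    sent = 0
    while sent < body_size:
        piece = read_chunk(min(CHUNK, body_size - sent))
        if not piece:
            break          # source ran dry early
        out.write(piece)
        out.flush()        # hand each chunk to the server immediately
        sent += len(piece)
    return sent
```

The point is that the script itself never holds more than one 64 KB chunk at a time; the question is why httpd still balloons to roughly the full file size.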

Thanks in advance for any help.
Juan

Re: [users@httpd] httpd process using too much memory

Posted by Juan E Suris <ju...@rootn.com>.
2.0.48... but I'm pretty sure that is not it, because I know ahead of time
the size of the file and set the Content-length header myself.
Juan

----- Original Message ----- 
From: "Joshua Slive" <jo...@slive.ca>
To: <us...@httpd.apache.org>
Sent: Saturday, March 06, 2004 9:09 PM
Subject: Re: [users@httpd] httpd process using too much memory


>
> On Sat, 6 Mar 2004, Nick Kew wrote:
> > > This in
> >  turn causes the httpd process serving the request to use up a lot of
> >  memory (about the size of the file being served).
> >
> > It shouldn't do that unless you have an output filter that breaks
> > pipelining and buffers the entire document.  I'd suggest inserting
> > mod_diagnostics in the output chain to see if/where that's happening,
> > and try to identify the module responsible.  Your CGI might be able
> > to fix it by setting different headers.
>
> Also, he doesn't mention what version of apache 2.0 he's using.  Earlier
> versions would grab the entire contents in memory to set the
> Content-Length header.
>
> Joshua.
>


---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
   "   from the digest: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org


Re: [users@httpd] httpd process using too much memory

Posted by Joshua Slive <jo...@slive.ca>.
On Sat, 6 Mar 2004, Nick Kew wrote:
> >	 This in
>  turn causes the httpd process serving the request to use up a lot of
>  memory (about the size of the file being served).
>
> It shouldn't do that unless you have an output filter that breaks
> pipelining and buffers the entire document.  I'd suggest inserting
> mod_diagnostics in the output chain to see if/where that's happening,
> and try to identify the module responsible.  Your CGI might be able
> to fix it by setting different headers.

Also, he doesn't mention what version of apache 2.0 he's using.  Earlier
versions would grab the entire contents in memory to set the
Content-Length header.

Joshua.



Re: [users@httpd] httpd process using too much memory

Posted by Juan E Suris <ju...@rootn.com>.
> On Sat, 6 Mar 2004, Juan E Suris wrote:
>
> > Hi All!
> >
> > My problem is that I have a perl cgi (on linux, apache 2.0) that
>   generates very large files, that are sent to the browser directly from
>   the perl script (using "Content-Disposition: attachment", etc.).
>
> Are you sure that's what you mean?
>
> That header is only meaningful in the context of a multipart MIME message.
> Is that really what you're sending to the browser?

Yes, that header is meant to force the browser to save the file instead of
opening it. In my case, I am sending a JPG, so instead of trying to display
the image, the browser will display the "Save As" dialog.

In any case, I see the behavior regardless of whether I send that header or
not.
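For the archives, the response headers I send look roughly like this (the filename and length here are just examples):

```
Content-Type: image/jpeg
Content-Disposition: attachment; filename="picture.jpg"
Content-Length: 52428800
```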

>
> > This in
>  turn causes the httpd process serving the request to use up a lot of
>  memory (about the size of the file being served).
>
> It shouldn't do that unless you have an output filter that breaks
> pipelining and buffers the entire document.  I'd suggest inserting
> mod_diagnostics in the output chain to see if/where that's happening,
> and try to identify the module responsible.  Your CGI might be able
> to fix it by setting different headers.
>

Not sure how to do that, I'll look into it.
Thanks,
Juan




Re: [users@httpd] httpd process using too much memory

Posted by Nick Kew <ni...@webthing.com>.
On Sat, 6 Mar 2004, Juan E Suris wrote:

> Hi All!
>
> My problem is that I have a perl cgi (on linux, apache 2.0) that
  generates very large files, that are sent to the browser directly from
  the perl script (using "Content-Disposition: attachment", etc.).

Are you sure that's what you mean?

That header is only meaningful in the context of a multipart MIME message.
Is that really what you're sending to the browser?

>	 This in
 turn causes the httpd process serving the request to use up a lot of
 memory (about the size of the file being served).

It shouldn't do that unless you have an output filter that breaks
pipelining and buffers the entire document.  I'd suggest inserting
mod_diagnostics in the output chain to see if/where that's happening,
and try to identify the module responsible.  Your CGI might be able
to fix it by setting different headers.

-- 
Nick Kew
