Posted to modperl@perl.apache.org by Clinton Gormley <cl...@traveljury.com> on 2007/05/11 19:23:09 UTC
Re: After retrieving data from DB, the memory doesn't seem to be freed up
> That's not really large data -- you're talking about dealing with
> 10-300k per request ( it should never go beyond that, because you'd
> be chunking stuff off the db for ease of download to the end user ).
>
> I've been under the impression ( and I'd imagine that others on this
> list are as well ) that you're talking about loading 10-100mb data
> structures for some sort of parsing or analysis -- which a lot of
> people here do. But you're talking about comparatively tiny amounts
> of data.
Agreed - and given the price of memory, it's a whole lot cheaper to use
some extra memory than to build complicated micro-optimizations that
send the data out byte by byte.
Much easier to just program naturally, with considerations like the
following:
- use Apache2::SizeLimit to kill off a process if it gets too big
(though this doesn't work under Windows)
- force your child process to exit after serving the request if you
have to do something big (e.g. processing a large image or generating a PDF)
Obviously, for the second case, I'm assuming that you would do these
things on a small percentage of your total requests, otherwise killing
off your child would be a major bottleneck.
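For the first point, a minimal sketch of an Apache2::SizeLimit setup might look like the following (the limit values here are made-up illustrations, and depending on your module version you may configure via package variables as shown or via setter functions - check the module's docs):

```perl
# in startup.pl -- sketch only, limit values are illustrative
use Apache2::SizeLimit ();

# kill the child after the request if its total size exceeds ~64 MB
$Apache2::SizeLimit::MAX_PROCESS_SIZE       = 64 * 1024;  # in KB

# checking on every request adds overhead; every 5th is usually enough
$Apache2::SizeLimit::CHECK_EVERY_N_REQUESTS = 5;
```

plus, in httpd.conf:

PerlCleanupHandler Apache2::SizeLimit

This works by checking the process size in a cleanup handler after each response and terminating the child if it has grown too large - which is exactly why it doesn't help under Windows, where Apache runs a single multi-threaded child.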
Clint
>
Re: After retrieving data from DB, the memory doesn't seem to be freed up
Posted by Clinton Gormley <cl...@traveljury.com>.
> >
> > Obviously, for the second case, I'm assuming that you would do these
> > things on a small percentage of your total requests, otherwise killing
> > off your child would be a major bottleneck.
>
> Here as well, killing a child process under Windows means killing my whole
> Apache server (and so all the serving threads).
I thought that $r->child_terminate would work under threaded MPMs as
well, but it doesn't.
My mistake.
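For reference, the prefork-only pattern I had in mind is sketched below (in mod_perl 2, $r->child_terminate comes from Apache2::RequestUtil; the handler body is a hypothetical placeholder):

```perl
use Apache2::RequestUtil ();
use Apache2::Const -compile => qw(OK);

sub handler {
    my $r = shift;

    # ... do the memory-hungry work here (big PDF, image processing) ...

    # Ask httpd to end this child once the response is complete,
    # returning its bloated memory to the OS. This only makes sense
    # under prefork: on a threaded MPM (e.g. winnt) there is a single
    # child for the whole server, so it can't be used there.
    $r->child_terminate();

    return Apache2::Const::OK;
}
```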
clint
Re: After retrieving data from DB, the memory doesn't seem to be freed up
Posted by Lionel MARTIN <ma...@hotmail.com>.
Hi,
I think you got me wrong.
My initial question was basically "how can I preserve/give back memory if
needed" (i.e. in rare situations), and the reply turned into "don't use
large scalars" (which is relevant, I agree, but doesn't directly answer
my initial question).
> - use Apache2::SizeLimit to kill off a process if it gets too big
> (but doesn't work under windows)
Unfortunately, I am developing under Windows, so...
> - force your child process to exit after serving the request if you
> have to do something big (eg process a large image, generate a PDF)
>
> Obviously, for the second case, I'm assuming that you would do these
> things on a small percentage of your total requests, otherwise killing
> off your child would be a major bottleneck.
Here as well, killing a child process under Windows means killing my whole
Apache server (and so all the serving threads).
So, that's out of the equation.
Lionel.