Posted to users@httpd.apache.org by Kevin Cameron <ke...@kevincameron.net> on 2004/07/10 14:20:42 UTC

[users@httpd] Memory limits?

Hi, I have been using Apache 1.3 with RedHat 7.2 for a while now, but am
upgrading to RedHat ES with Apache 2.0 (actually the
httpd-2.0.46-32.ent rpm that comes with RedHat ES).

I have found, however, that several CGI programs (all Perl) which used to
work fine no longer work during certain routines. They simply hang and
eventually time out. I have tracked the problem in some of the CGIs and
found that if I cut out a certain line or loop that I presume uses more
memory or time to execute than some other line, the program will run
through to completion.  Likewise, on very simple pages, there is no
problem executing the script to completion.

I am not receiving any errors, and I have Apache configured the same as
on the previous servers. I do not see any increase in memory usage with
top either. The process simply hangs and times out.

Is there some setting that is restricting my per-process memory
usage? I have searched and searched for any clue, but can't find
anything.

I am not certain if this is even an apache issue, but if it is, or if 
someone has experienced the same thing, I would appreciate some help.

Thanks
Kevin
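
The usual ways such a cap would be imposed are a ulimit inherited from
whatever shell started httpd, or httpd's own RLimitMEM / RLimitCPU /
RLimitNPROC directives; none of these are normally present in a stock
configuration. One quick check is a throwaway CGI that prints the limits
its own process runs under. The sketch below is illustrative only; the
script name and location are placeholders, not part of Kevin's setup:

  #!/bin/sh
  # Hypothetical diagnostic CGI (e.g. cgi-bin/showlimits.cgi): print the
  # resource limits this CGI process actually runs under, so a low memory
  # or CPU cap would show up immediately.
  echo "Content-Type: text/plain"
  echo
  ulimit -a

If the memory-related entries all read "unlimited", a per-process memory
cap can be ruled out; the replies later in this thread point at stderr
handling in mod_cgi instead.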




Re: [users@httpd] Memory limits? / exit signal Segmentation fault (11)

Posted by Joe Orton <jo...@redhat.com>.
On Mon, Jul 12, 2004 at 12:19:41AM +0900, Kevin Cameron wrote:
> 
> On Jul 10, 2004, at 11:28 PM, Kevin Cameron wrote:
> 
> >>
> >>That would be a workaround. A fix would be to upgrade to 2.0.50.
> 
> Ok, so I tried upgrading from
> 
> httpd-2.0.46-32.ent rpm to
> httpd-2.0.50-1.0.i386.rpm
> 
> In the process it said I had to satisfy these dependencies:
> apr-0.9.4-2.src.rpm
> libapr0-2.0.50-1.i586.rpm

These look like SuSE packages and it's highly likely they won't work
correctly on RHEL3.  The RHEL3 test update packages available here
include the mod_cgi changes necessary to handle stderr correctly:

http://people.redhat.com/jorton/Taroon-httpd/

Completely remove any SuSE packages you've installed before installing
these: "rpm --erase --nodeps libapr0 httpd".

Regards,

joe
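
Spelled out as a shell sketch of what Joe describes: remove the mismatched
SuSE packages first, then install the RHEL3 test update. The thread does
not list the exact file names under that URL, so the package name in the
second command is a placeholder:

  # remove the mismatched SuSE packages first
  rpm --erase --nodeps libapr0 httpd
  # then install whatever httpd test-update package is actually published
  # at the URL above (the file name below is a placeholder)
  rpm -Uvh httpd-2.0.46-XX.ent.i386.rpm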



Re: [users@httpd] Memory limits? / exit signal Segmentation fault (11)

Posted by Joshua Slive <js...@gmail.com>.
On Mon, 12 Jul 2004 00:19:41 +0900, Kevin Cameron
<ke...@kevincameron.net> wrote:

> Ok, so I tried upgrading from
> 
> httpd-2.0.46-32.ent rpm to
> httpd-2.0.50-1.0.i386.rpm

> [Mon Jul 12 00:03:39 2004] [notice] child pid 6887 exit signal Segmentation fault (11)
> 
> The only thing in there that looks like an error to me is the
> child pid 6887 exit signal Segmentation fault (11) line.  Do the other
> lines mean I have to fix something, or are they just informational
> messages?

The other messages are normal.

> 
> Is there something else I have to install with this version of Apache?
> Or do I have to upgrade anything else also?
> I searched for the segmentation fault message in the archives, and it
> seems like every time someone had this problem, there was also a clue
> as to what else should be fixed, but I don't see any such clue in my case.

If I were you, I'd report this to the supplier of the rpm.
Segmentation fault means "something really bad happened".  Sometimes
you also get a core dump (see the CoreDumpDirectory directive), which
can pinpoint the source of the problem.  Otherwise, you are left
searching randomly (you can start by disabling all but the most basic
modules to try to identify whether the problem is caused by a specific
module).

Joshua.
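
A rough sketch of the core-dump route Joshua suggests; CoreDumpDirectory
is a standard httpd directive, but the directory, paths and core file
name below are placeholders, and the directory must be writable by the
user the child processes run as:

  # quick-and-dirty scratch directory for core files
  mkdir -p /tmp/httpd-cores
  chmod 1777 /tmp/httpd-cores
  # in httpd.conf:
  #   CoreDumpDirectory /tmp/httpd-cores
  # core files must also be allowed at the OS level, e.g. by raising
  # "ulimit -c" in the environment that starts httpd, then restarting it.
  # Once a core appears, a backtrace usually names the guilty module:
  gdb /usr/sbin/httpd /tmp/httpd-cores/core.6887   # core name is illustrative
  # (gdb) bt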



Re: [users@httpd] Memory limits? / exit signal Segmentation fault (11)

Posted by Kevin Cameron <ke...@kevincameron.net>.
On Jul 10, 2004, at 11:28 PM, Kevin Cameron wrote:

>>
>> That would be a workaround. A fix would be to upgrade to 2.0.50.

Ok, so I tried upgrading from

httpd-2.0.46-32.ent rpm to
httpd-2.0.50-1.0.i386.rpm

In the process it said I had to satisfy these dependencies:
apr-0.9.4-2.src.rpm
libapr0-2.0.50-1.i586.rpm

I restarted Apache ( /etc/rc.d/init.d/httpd restart ) and there were
no errors. When I tried to view the site in the browser, however,
nothing happened, and the error log said this:

[Sun Jul 11 23:21:35 2004] [notice] caught SIGTERM, shutting down
[Mon Jul 12 00:03:23 2004] [notice] LDAP: Built with OpenLDAP LDAP SDK
[Mon Jul 12 00:03:23 2004] [notice] LDAP: SSL support unavailable
[Mon Jul 12 00:03:23 2004] [notice] suEXEC mechanism enabled (wrapper: /usr/sbin/suexec)
[Mon Jul 12 00:03:24 2004] [notice] Digest: generating secret for digest authentication ...
[Mon Jul 12 00:03:24 2004] [notice] Digest: done
[Mon Jul 12 00:03:24 2004] [notice] LDAP: Built with OpenLDAP LDAP SDK
[Mon Jul 12 00:03:24 2004] [notice] LDAP: SSL support unavailable
[Mon Jul 12 00:03:25 2004] [notice] Apache/2.0.50 (Fedora) configured -- resuming normal operations
[Mon Jul 12 00:03:39 2004] [notice] child pid 6887 exit signal Segmentation fault (11)

The only thing in there that looks like an error to me is the
child pid 6887 exit signal Segmentation fault (11) line.  Do the other
lines mean I have to fix something, or are they just informational
messages?

Is there something else I have to install with this version of Apache?
Or do I have to upgrade anything else also?
I searched for the segmentation fault message in the archives, and it
seems like every time someone had this problem, there was also a clue
as to what else should be fixed, but I don't see any such clue in my case.

Regards,
Kevin




Re: [users@httpd] Memory limits?

Posted by Kevin Cameron <ke...@kevincameron.net>.
>
> That would be a workaround. A fix would be to upgrade to 2.0.50.
>

Great! Thanks.

I'll try it!

Kevin




Re: [users@httpd] Memory limits?

Posted by André Malo <nd...@perlig.de>.
* Kevin Cameron <ke...@kevincameron.net> wrote:

> >
> > Are these scripts writing stuff to stderr (deliberately or 
> > accidentally)?
> >
> 
> Yes, they are writing to the httpd error log if that is what you mean.
> 
> I hope you are going to give me some really simple advice that will fix
> the problem, like turning off warnings in the script. :)

That would be a workaround. A fix would be to upgrade to 2.0.50.

nd
-- 
"Gates's behaviour had proved to me that I could not count on him and
his two companions" -- Karl May, "Winnetou III"

Something new in the West: <http://pub.perlig.de/books.html#apache2>
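
One concrete form of the workaround André mentions, for anyone who cannot
upgrade right away, is simply to keep the scripts' warnings out of httpd's
stderr pipe. The wrapper below is a sketch: the path to the real Perl
script is a placeholder, and disabling the warnings inside the script
itself achieves the same thing.

  #!/bin/sh
  # wrapper CGI: run the real script with stderr discarded, so nothing is
  # written down the pipe that feeds httpd's error log
  exec /var/www/cgi-bin/real-script.pl 2>/dev/null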



Re: [users@httpd] Memory limits?

Posted by Kevin Cameron <ke...@kevincameron.net>.
>
> Are these scripts writing stuff to stderr (deliberately or 
> accidentally)?
>

Yes, they are writing to the httpd error log if that is what you mean.

I hope you are going to give me some really simple advice that will fix
the problem, like turning off warnings in the script. :)

Kevin




Re: [users@httpd] Memory limits?

Posted by Joshua Slive <js...@gmail.com>.
On Sat, 10 Jul 2004 21:20:42 +0900, Kevin Cameron
<ke...@kevincameron.net> wrote:
> Hi, I have been using Apache 1.3 with RedHat 7.2 for a while now, but am
> upgrading to RedHat ES with Apache 2.0 (actually the
> httpd-2.0.46-32.ent rpm that comes with RedHat ES).
> 
> I have found, however, that several CGI programs (all Perl) which used to
> work fine no longer work during certain routines. They simply hang and
> eventually time out. I have tracked the problem in some of the CGIs and
> found that if I cut out a certain line or loop that I presume uses more
> memory or time to execute than some other line, the program will run
> through to completion.  Likewise, on very simple pages, there is no
> problem executing the script to completion.

Are these scripts writing stuff to stderr (deliberately or accidentally)?

Joshua.
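
The symptom Kevin describes above, a hang that depends on how much a
particular loop runs, with no error message and no memory growth, is
consistent with the script blocking once it has written enough to stderr,
which is presumably why Joshua asks this; the other replies in the thread
point at the mod_cgi stderr fix in 2.0.50. Below is a contrived sketch of
a script that generates that kind of stderr traffic; in the real scripts
it is Perl warnings, not explicit writes, that end up on stderr:

  #!/bin/sh
  # contrived example: every line sent to file descriptor 2 goes down the
  # stderr pipe that feeds httpd's error log; with enough of it, the
  # request can stall and eventually hit the server's Timeout
  echo "Content-Type: text/plain"
  echo
  i=0
  while [ "$i" -lt 10000 ]; do
      echo "debug: processing row $i" >&2
      echo "row $i"
      i=$((i + 1))
  done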

---------------------------------------------------------------------
The official User-To-User support forum of the Apache HTTP Server Project.
See <URL:http://httpd.apache.org/userslist.html> for more info.
To unsubscribe, e-mail: users-unsubscribe@httpd.apache.org
To unsubscribe from the digest, e-mail: users-digest-unsubscribe@httpd.apache.org
For additional commands, e-mail: users-help@httpd.apache.org