Posted to dev@httpd.apache.org by "Ronald F. Guilmette" <rf...@monkeys.com> on 2002/05/31 08:48:22 UTC

Re: Need a new feature: Listing of CGI-enabled directories.

In message <20...@cryptocracy.com>, 
Zac Stevens <zt...@cryptocracy.com> wrote:

>Let me preface this by noting that I agree with your goals, however I
>believe that you may not have considered the nature of the problem in
>sufficient depth.

I'll buy that.  I mean it would be arrogant of me... and possibly also
just plain wrong... not to admit the possibility that I have misjudged
the true nature of the problem.

>On Thu, May 30, 2002 at 07:45:42PM -0700, Ronald F. Guilmette wrote:
>> The first step in finding all such scripts however may often be the most
>> difficult one.  That first step consists of simply gathering into one
>> big list a list of all of the CGI-enabled directories on the local web
>> server.  Once such a list has been compiled, other standard UNIX tools
such as `find' and `file' and `grep' can be set to work, plowing through
>> all of the files in those (CGI) directories and finding all of the bad
>> FormMail scripts.
>
>How are you defining "bad FormMail scripts" here?

Spammer exploitable.

A more detailed elaboration of that term, as I use it, may be found here:

    http://www.monkeys.com/anti-spam/formmail-advisory.pdf

>In the simplest case,
>you could just run 'find' across the filesystem containing the web content
>looking for "formmail.cgi" or "formmail.pl"...

Hold on there a moment!

The object here is to do the search _efficiently_, i.e. such that it can
be done even by very large virtual web hosting companies, on all web
servers, and every night.

Searching the entire filesystem is out of the question.

> and checking those found
>against a list of known good/bad versions.  This doesn't require any
>specific knowledge of the Apache configuration in use, and is an entirely
>viable approach even on filesystems of several hundred gigabytes.

I believe that I disagree.

There are two separate problems with what you proposed.  First is the
fact that just searching for _filenames_ with the word "formmail" or
"FormMail" in them is not sufficient to find all of the bad Matt Wright
FormMail scripts that are installed on a given server.  End-lusers
often install the scripts using different names, e.g. "form.pl" or
"mail.pl" or just "fm.pl".  The second problem is the notion of searching
several hundred gigabytes of filesystem.  That just isn't a viable approach,
especially given that some of the parties I'm dealing with on this issue
are already balking, even at the notion of merely scanning _just_ their
CGI-enabled directories.

>A more thorough check would involve testing all executable ascii files,
>perhaps excluding .html/.php and variants thereof.

Yes, and that is what is needed.

Every plain file that has the execute bit set and that resides in any
directory for which ExecCGI is enabled must be checked (a) to see if
it is a Perl script and then (b) to see if it is a Matt Wright Perl
script.
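The check just described can be sketched roughly as follows.  This is only
an illustration of the two-step test, not anyone's actual detection logic:
the Perl shebang prefixes and the "Matt Wright" signature string are
assumptions, and a real scanner would want a more robust fingerprint.

```python
#!/usr/bin/env python3
# Sketch of the check described above: walk a list of CGI-enabled
# directories, and for every plain file with an execute bit set, test
# (a) whether it looks like a Perl script (shebang check), and
# (b) whether it looks like Matt Wright's FormMail (content signature;
#     the marker string used here is an assumption).
import os
import stat

PERL_SHEBANGS = (b"#!/usr/bin/perl", b"#!/usr/local/bin/perl")
FORMMAIL_MARKER = b"Matt Wright"   # hypothetical signature string

def find_suspect_scripts(cgi_dirs):
    suspects = []
    for top in cgi_dirs:
        for dirpath, _dirnames, filenames in os.walk(top):
            for name in filenames:
                path = os.path.join(dirpath, name)
                st = os.lstat(path)
                if not stat.S_ISREG(st.st_mode):
                    continue                      # plain files only
                if not st.st_mode & 0o111:
                    continue                      # execute bit required
                with open(path, "rb") as fh:
                    head = fh.read(8192)
                if not head.startswith(PERL_SHEBANGS):
                    continue                      # (a) not a Perl script
                if FORMMAIL_MARKER in head:
                    suspects.append(path)         # (b) looks like FormMail
    return suspects
```

Because only the execute-bit files in the (comparatively few) CGI-enabled
directories are opened, this stays cheap even on a very large filesystem.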

>> But seriously, is there already a way to do what I need to do with the
>> Apache server?  Looking at the command line options on the man page for
>> httpd, there doesn't seem to be an option to request httpd to just parse
>> the config file, list the CGI directories, and then exit.  But that's
>> exactly what I need.
>
>It isn't possible in the generic case.  Apache allows many aspects of the
>configuration to be modified by the inclusion of ".htaccess" files beneath
>the DocumentRoot of the web server.  Unless Apache has been configured not
>to allow the ExecCGI option to be overridden, you would need to parse both
>httpd.conf AND every .htaccess file on the filesystem.  Apache itself does
>not do this at startup - it is done only as required.

Make the simplifying assumption that enabling ExecCGI via .htaccess files
has been disallowed within the httpd.conf file.

_Now_ tell me how to get a list of top-level CGI-enabled directories out
of httpd... please.
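Absent such an option in httpd itself, the list can be approximated from the
configuration file.  A rough sketch, under the simplifying assumption above:
look for ScriptAlias targets and for <Directory> blocks whose Options enable
ExecCGI.  This is a naive line-based parse, not a real Apache config parser
(no Include handling, no containers other than <Directory>):

```python
#!/usr/bin/env python3
# Approximate the top-level CGI-enabled directories from httpd.conf text,
# assuming ExecCGI cannot be re-enabled via .htaccess files.
import re

def cgi_dirs_from_conf(conf_text):
    dirs = set()
    current_dir = None
    for line in conf_text.splitlines():
        line = line.strip()
        # ScriptAlias url-path directory-path
        m = re.match(r'ScriptAlias\s+\S+\s+"?([^"\s]+)"?', line, re.I)
        if m:
            dirs.add(m.group(1))
            continue
        m = re.match(r'<Directory\s+"?([^">]+?)"?\s*>', line, re.I)
        if m:
            current_dir = m.group(1)
            continue
        if re.match(r'</Directory>', line, re.I):
            current_dir = None
            continue
        # An Options line enabling (not disabling) ExecCGI
        if current_dir and re.match(r'Options\b.*\bExecCGI\b', line, re.I) \
                and '-ExecCGI' not in line:
            dirs.add(current_dir)
    return sorted(dirs)
```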

>You also can't assume that only files in designated CGI directories are
>going to be problematic.

Actually, I believe that I can, under a certain set of conditions.

If you would like to help me flesh out what that exact set of conditions
is, then by all means, please do.  I will appreciate it.

Understand that I am _not_ trying to build a solution to this searching
problem that will cover every possible contingency.  I will be satisfied
to build a solution that I can offer to web hosting companies and then
tell them ``This will work if you carefully restrict ExecCGI capability
by doing thus-and-such.''

That will be adequate for my purposes.

>There's a long history of people using all sorts
>of redirection and other techniques to access/execute things that they
>shouldn't be able to.

OK, so where is the ``best practices'' document for Apache that describes,
in detail, exactly what webmasters must do (e.g. in the httpd.conf files)
in order to avoid exactly the kind of ``unexpected execute permission''
problem that you have just mentioned?

If there is no such document, then maybe it is time somebody wrote one.
(I will volunteer to write it, if other folks will provide the necessary
technical input.)

The bottom line here is that while I accept what you say, i.e. that
properly restricting ExecCGI can be tricky to accomplish, I also feel quite
sure that the Apache developer community has most probably _not_ said to
the Apache user community ``Don't even bother trying to restrict ExecCGI
capability, because you can't.  Anything in your whole filesystem can be
an executable CGI, no matter what you do.''

I don't accept that.  No webmaster in his right mind would ever accept
that.

There certainly must be ways to properly and completely restrict ExecCGI
capability to certain specific directories chosen only by the local web
administrator.  OK.  Fine.  Now I just need to write down what those ways
are.

Sounds simple enough to me.
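For what it's worth, one plausible core of such a write-up, sketched as an
httpd.conf fragment.  These are standard Apache directives; the paths are
placeholders, and a real document would need to cover the other containers
and handlers as well:

```apache
# Deny CGI execution everywhere by default, and forbid .htaccess
# files from overriding Options (and hence from re-enabling ExecCGI).
<Directory />
    Options -ExecCGI
    AllowOverride None
</Directory>

# Re-enable CGI only in directories the administrator has chosen.
ScriptAlias /cgi-bin/ "/usr/local/apache/cgi-bin/"
<Directory "/usr/local/apache/cgi-bin/">
    Options +ExecCGI
    AllowOverride None
</Directory>
```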

>It should be apparent at this point that what you're looking at here is a
>reduction of the abuse of formmail & etc, as it is almost impossible to
>stamp out entirely.

See above.

I don't accept that proper CGI security is an impossibility.

I hate to use the trite catch-phrase, but I'm forced to do so in this
circumstance... ``If we can put a man on the moon...''

>In this particular example, the best solution is for the web host to
>replace the sendmail binary with something which performs more rigorous
>checking of its input and, ideally, tags outbound messages with identifying
>information.

You said earlier that I didn't fully understand the problem.

Now, unfortunately, I have to return the compliment.  I don't believe
that you understand the problem.

The problem isn't primarily that the versions of the FormMail script
that are installed by and approved by the local web administrators at
these sites are bad.  They aren't.  Those scripts are generally secure.
The problem is that big web hosting companies allow any one of their
clueless end-luser virtual web hosting customers to upload and install
_any_ arbitrary CGI script at any time into that customer's own private
CGI area.

_This_ is the source of most of these problems... end-lusers who are out
there doing their own thing and installing crappy scripts on their own
that the local web administrator would never approve of in a million
years *if* he ever even became aware that they had been uploaded and/or
installed.

Getting intelligent and experienced web administrators to Be Careful and
to Do The Right Thing is easy.  The hard part is dealing with all of
the problems caused by all of those zillions of clueless $9.95/month
virtual web hosting customers who are doing things and installing scripts
(e.g. at 3 AM) when the intelligent and responsible web server
administrator isn't even in the building.

I am working on solutions to address this latter problem.

>This is the solution implemented at a former employer...
>...
>Although the script was developed in-house (and, unfortunately, cannot be
>released) I believe there are open-source alternatives available to
>accomplish the same ends.

I have already developed a secure version of the FormMail script, and
I have already been distributing it for some time.  Also, the London
Perl Mongers group has been doing likewise.  They also distribute a
secure version of the script.

I expect intelligent web administrators to obtain one of these secure
versions of FormMail, if and when they need FormMail at all.  But again,
that is not the issue.  The issue is how to find the _insecure_ FormMail
scripts that those darn end-lusers have uploaded when nobody was watching
them.

>I hope these thoughts provide you with a different perspective to consider,
>and I wish you well in your efforts to reduce the amount of spam on the
>Internet.

Thank you.

Certainly your comments about the possible extension of ExecCGI capability
via .htaccess files are well taken.  That's something that I admit that
I hadn't known about, and didn't consider.  But it definitely needs to
be factored into any comprehensive solution.

Nonetheless, as I've noted above, I believe that there must certainly
be ways in which this potential problem can be adequately controlled
and restricted.  I just need to make sure that whatever solution
I offer or suggest to web administrators elsewhere includes some
appropriate statements about how to fully restrict ExecCGI capability
so that it is, and can only be, extended to a fixed set of directories
of the local administrator's choosing.


Regards,
Ron Guilmette

Re: [OT] Need a new feature: Listing of CGI-enabled directories.

Posted by Zac Stevens <zt...@cryptocracy.com>.
Hi Ron,

Thanks for your detailed response to my post, I'll reply later this
evening off list.

I do want to jump in on this, though..

On Fri, May 31, 2002 at 12:24:30AM -0700, Ronald F. Guilmette wrote:
> In the case of FormMail scripts, if the big web hosting companies can
> just scan all of their CGI directories for them every night and then
> simply `rm' or `chmod 0000' anything found by the scans of the previous
night every morning, then that will be about 99.9% sufficient to eliminate
> the problem.

To provide you with a bit of context, my comments come after having run a
Solaris/Apache-based virtual hosting service in Australia for approximately
5000 sites on around 200GB of disk (on a Network Appliance filer.)  I'm
security conscious, but I'm also pragmatic - as you yourself seem to be
aware, commercial realities do put a stop to best practices some of the
time.

I have tried, and failed, to correct bad user behaviour by the means you
have described above.  In addition to removing execute permissions and
chown'ing files to root, I have attempted to leave messages by way of
conspicuously-named files in the relevant directories.

None of this was met with much success.  Typically, the user would
re-upload the script, or delete and re-upload the entire site, and the
problem would begin anew.  Naturally this probably could have been stopped
with more fascist permissioning, but this really isn't the sort of thing
most companies are comfortable doing to their paying customers.

The problem is that none of these things alerts the user to the problem -
it just creates a new problem to grab their attention.  At sites where
valid contact e-mail addresses are available for every customer, contacting
them directly can be a more effective way of resolving the issue.


In the aforementioned web hosting environment, we did two things to
minimise the amount of spam mail originating from CGI scripts on our
system.

1) Supplied a formmail equivalent as standard.  Our script added a few
   extra niceties, but essentially it was just a safe formmail.

The existence of this script and clear instructions on how to use it were
supplied to every customer when their site was set up.  The support staff
were trained in using the script and diagnosing problems.  Combined with
the fact that support was not offered for 3rd-party scripts (although they
were permitted), this resulted in a high usage rate for our custom script.

2) The sendmail binary was replaced with a script which did sanity checking
   and added identifying details.

This is a simple and extremely effective way to put an end to the problem
of spam mail originating from virtual-hosted customers.  Perhaps due to the
nature of our business (and the fact that we supplied a safe CGI mailer),
the majority of our problems came from developers who had written their own
PHP or CGI scripts.  Many did not understand the need to write secure code,
nor did they know how to.  By applying checks on the mail leaving the
system, you catch
all the problem scripts - not just those you already know about.
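A minimal sketch of the checking-and-tagging step of such a wrapper.  The
path of the real binary, the recipient limit, and the header name are all
assumptions, and the actual hand-off to the real sendmail is elided:

```python
#!/usr/bin/env python3
# Sketch of the validation step of a sendmail replacement.  In real use
# a tiny main program would run this over the message on stdin and then
# exec the real binary, which the administrator has moved aside.
import os

REAL_SENDMAIL = "/usr/lib/sendmail.real"   # hypothetical location
MAX_RECIPIENTS = 5                         # arbitrary policy choice

def check_and_tag(argv, message):
    """Validate a sendmail-style command line and tag the message."""
    recipients = [a for a in argv[1:] if not a.startswith("-")]
    if len(recipients) > MAX_RECIPIENTS:
        raise ValueError("too many recipients")
    # Embedded newlines or commas in a recipient argument are the
    # classic sign of a FormMail-style injection attempt.
    if any("\n" in r or "," in r for r in recipients):
        raise ValueError("malformed recipient")
    # Stamp the message so any abuse can be traced to the calling UID.
    tag = "X-Web-Originating-Uid: %d\n" % os.getuid()
    return tag + message
```

The identifying header is what makes this effective operationally: when a
spam complaint arrives, the header points straight at the offending account.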


> And that's a lot better than doing nothing.

Anything is better than nothing - it's an issue of getting the best result
from the most acceptable expenditure of effort.  Perhaps we merely differ
in our opinions of where that effort is best spent.

Cheers,


Zac

Re: Need a new feature: Listing of CGI-enabled directories.

Posted by Rasmus Lerdorf <ra...@apache.org>.
mod_info will tell you some of this, i.e. look for ScriptAlias lines under
mod_alias.c and AddHandler cgi-script lines under mod_mime.c.

But you are fighting a bit of a lost cause here.  If you allow users to
upload arbitrary CGI scripts there really isn't much you can do at the
Apache level.  It becomes a system security issue: blocking outbound
port 25 connections, for example.

-Rasmus
