Posted to users@httpd.apache.org by Lester Caine <le...@lsces.co.uk> on 2012/02/27 11:32:05 UTC

[users@httpd] Preferred way of limiting direct access

Spring cleaning time ...

I have a number of older dynamically built sites that are still using .htaccess 
to limit access to areas that only PHP needs to read. This is the simple way of 
doing things, but I am looking at current practice, and wondering whether the 
performance hit is something I need to be concerned about.
What is the tidiest way to limit access via the <Directory> entry (or do I need 
multiple entries?) so that people can download content in the storage area, but 
only access the functions (PHP pages) in the various packages?

Each package has its own directory under root, and some packages are only used 
'internally', but despite having some directories specifically blocked in the 
robots.txt file, they are still being trawled by search engines, so I think I 
need to restore the .htaccess set-up in each?

Basically, should I just 'deny all' at the root and then open holes to the bits 
that need to be visible? The storage directory is easy, and .php in the 
accessible packages, but it's the .js, .css, icons and style elements that seem 
confusing. In practice, the basic structure of the project may be wrong for the 
way Apache now expects to work, but keeping everything related to a bundle like 
'wiki' in one directory allows a modular approach which IS working well otherwise.
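
Just to make that concrete, something like this is roughly what I have in mind 
in httpd.conf - the paths and the 'wiki' package are only placeholders, so 
treat it as a sketch:

# deny everything under the site by default
<Directory /var/www/site>
    AllowOverride None
    Require all denied
</Directory>

# storage area - downloads are open to everyone
<Directory /var/www/site/storage>
    Require all granted
</Directory>

# a package like 'wiki' - only expose the PHP entry points and style elements
<Directory /var/www/site/wiki>
    <FilesMatch "\.(php|js|css|png|gif|ico)$">
        Require all granted
    </FilesMatch>
</Directory>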

-- 
Lester Caine - G8HFL
-----------------------------
Contact - http://lsces.co.uk/wiki/?page=contact
L.S.Caine Electronic Services - http://lsces.co.uk
EnquirySolve - http://enquirysolve.com/
Model Engineers Digital Workshop - http://medw.co.uk//
Firebird - http://www.firebirdsql.org/index.php



Re: [users@httpd] Preferred way of limiting direct access

Posted by Noel Butler <no...@ausics.net>.
On Mon, 2012-02-27 at 10:32 +0000, Lester Caine wrote:

> Spring cleaning time ...
> 
> I have a number of older dynamically built sites that are still using .htaccess 
> to limit access to areas that only PHP needs to read. This is the simple way of 
> doing things, but I am looking at current practice, and wondering whether the 
> performance hit is something I need to be concerned about.


Best practice is to put the access rules in your main httpd.conf file; if you
have thousands of hosts it is much kinder on resources than per-request
.htaccess lookups.
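
e.g. a minimal sketch (the path is just an example) - with AllowOverride None
Apache doesn't even look for .htaccess files, the rules get parsed once at
startup:

<Directory "/var/www/example.com/htdocs">
    AllowOverride None
    Require all granted
</Directory>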



> What is the tidiest way to limit access via the <Directory> entry (or do I need 
> multiple entries?) so that people can download content in the storage area, but 
> only access the functions (PHP pages) in the various packages?
> 
> Each package has its own directory under root, and some packages are only used 
> 'internally', but despite having some directories specifically blocked in the 
> robots.txt file, they are still being trawled by search engines, so I think I 
> need to restore the .htaccess set-up in each?
> 
> Basically, should I just 'deny all' at the root and then open holes to the bits 
> that need to be visible? The storage directory is easy, and .php in the 
> accessible packages, but it's the .js, .css, icons and style elements that seem 
> confusing. In practice, the basic structure of the project may be wrong for the 
> way Apache now expects to work, but keeping everything related to a bundle like 
> 'wiki' in one directory allows a modular approach which IS working well otherwise.
> 


Always deny everything by default, then open up what you want accessible on a
global scale. Things like wikis, forums and webmail scripts are often not very
nice for that, because they want to allow access to the top directory but deny
the sub-dirs, so a .htaccess file can be easier to manage; but if there are only
a few dirs, I'd opt for httpd.conf to keep it cleaner.

base:
<Directory />
    AllowOverride None
    Require all denied
</Directory>

allow access to the web server root holding the virtual hosts:
<Directory "/var/www/hosts">
    AllowOverride None
    Require all granted
</Directory>

to protect each host's admin section, add a deny entry per domain:

<Directory /var/www/hosts/example.com/admin>
    Require ip 10.10.0 127.0.0.1
    # Auth* stuff (AuthType, AuthName, AuthUserFile, ...)
    Require valid-user
</Directory>
<Directory /var/www/hosts/example.net/admin>
    Require ip 10.10.0 127.0.0.1
    # Auth* stuff (AuthType, AuthName, AuthUserFile, ...)
    Require valid-user
</Directory>
...etc...

Simple, but when you have stuff that has 20 subdirs and wants to stop you
accessing inc, css, lib, foo, bar etc., your httpd.conf becomes rather large and
maybe messy, so it's easier to then use .htaccess to stop them in those dirs.
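
e.g. a one-line .htaccess dropped into each of those private sub-dirs - just an
illustration, and it needs AllowOverride AuthConfig enabled for that tree:

# .htaccess in inc/, lib/, css/ etc. - block all direct access
Require all denied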

many bots also don't care about robots.txt files, even if you have 
User-agent: *
Disallow: /

That says 'don't traverse me', but a bot will still hit the root dir of the
URL to check it.

Note: if you're using 2.0/2.2, replace the Require ip lines with Order/Deny,Allow
and Satisfy Any..... you know the drill.
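
i.e. the 2.2-style version of one of the admin blocks above would look
something like this (from memory, so treat it as a sketch):

<Directory /var/www/hosts/example.com/admin>
    Order Deny,Allow
    Deny from all
    Allow from 10.10.0 127.0.0.1
    # Auth* stuff (AuthType, AuthName, AuthUserFile, ...)
    Require valid-user
    Satisfy Any
</Directory>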