Posted to dev@pagespeed.apache.org by GitBox <gi...@apache.org> on 2018/11/01 09:34:49 UTC

[GitHub] AlexeyKosov opened a new issue #1604: Disable pagespeed optimization for specific user agent

URL: https://github.com/apache/incubator-pagespeed-ngx/issues/1604
 
 
   Is it possible to disable some or all optimizations for a certain user agent (crawler)? When it crawls the site, CPU usage for the nginx process gets very high, primarily because of the pagespeed module.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

Re: [GitHub] AlexeyKosov opened a new issue #1604: Disable pagespeed optimization for specific user agent

Posted by Nick Kew <ni...@apache.org>.
>   Is it possible to disable some or all optimizations for a certain user agent (crawler)? When it crawls the site, CPU usage for the nginx process gets very high, primarily because of the pagespeed module.

That would be for your general configuration.  Pagespeed is a module,
not a complete replacement for your server!

If a crawler is so ill-behaved as to flood your server, I'd simply ban it.
Have the server return an instant 400.  Rules for well-behaved robots
have been perfectly clear for more than twenty years.
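As a rough sketch of that suggestion: in nginx, a request can be rejected by user agent before pagespeed (or anything else) does work on it. The crawler name "HungryBot" below is a placeholder, not the actual offending agent, and the server block is illustrative only:

```nginx
server {
    listen 80;
    server_name example.com;

    # Reject the misbehaving crawler up front, before pagespeed runs.
    # "HungryBot" is a hypothetical User-Agent substring; substitute the
    # real crawler's identifier. return 400 sends an instant error;
    # nginx's non-standard 444 would close the connection with no response.
    if ($http_user_agent ~* "HungryBot") {
        return 400;
    }

    pagespeed on;
    # ... rest of the pagespeed and site configuration ...
}
```

Separately, ngx_pagespeed's `pagespeed off;` directive can disable optimization for an entire server or location block, which may help if the heavy traffic is confined to particular paths.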

@pagespeed - this isn't even a development issue, yet like everything else
it arrived through github.  Is there really anything happening at apache, as
opposed to being a purely github project?

-- 
Nick Kew