Posted to dev@lenya.apache.org by Doug Chestnut <ch...@apache.org> on 2005/06/29 18:22:36 UTC
robots.txt and google sitemap
Hi devs,
I've been thinking about how to generate a robots.txt file and a Google
sitemap file for my site. I guess the easy impl would be to add an
attribute to the sitetree node that defines search priority (a negative
value would signify that the page shouldn't be indexed).
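That easy impl could be sketched roughly like this — the node list, the
priority values, and the base URL are all made-up placeholders, not
Lenya APIs; the only point is how one attribute can drive both outputs:

```python
# Hypothetical sketch: one "search priority" attribute per sitetree
# node drives both robots.txt (negative => exclude) and the Google
# sitemap (non-negative => include with that priority).

# Placeholder stand-in for sitetree nodes: (path, search_priority)
nodes = [
    ("/index.html", 0.8),
    ("/news.html", 0.5),
    ("/private/draft.html", -1.0),  # negative: should not be indexed
]

def robots_txt(nodes):
    """Emit Disallow lines for every node with a negative priority."""
    lines = ["User-agent: *"]
    lines += ["Disallow: %s" % path for path, prio in nodes if prio < 0]
    return "\n".join(lines) + "\n"

def sitemap_xml(nodes, base="http://example.org"):
    """Emit a minimal Google sitemap for the non-excluded nodes."""
    entries = "".join(
        "<url><loc>%s%s</loc><priority>%.1f</priority></url>"
        % (base, path, prio)
        for path, prio in nodes
        if prio >= 0
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            '%s</urlset>' % entries)
```

In Lenya this would presumably be a pipeline over the sitetree rather
than a standalone script, but the mapping is the same.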
Seems like the better approach would be to make this part of the
document's metadata; then one could specify robots that shouldn't
access a document, and even block access (or set traps). Perhaps this
might be something to add to the AC Live page? Maybe it should be
split between metadata (search priority) and AC (exclude robots -
all|specific)?
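To make the split concrete, the per-document side might look something
like this — purely an invented fragment, not an existing Lenya metadata
or AC schema:

```xml
<!-- Hypothetical per-document settings under the metadata/AC split -->
<!-- metadata half: how search engines should rank the page -->
<meta name="search-priority" value="0.8"/>
<!-- AC half: which robots may access the page at all -->
<robots exclude="specific">
  <robot name="BadBot" action="block"/>
</robots>
```

The generator would then fold the AC half into robots.txt (and any
blocking/traps) and the metadata half into the sitemap priorities.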
WDYT?
Thanks,
--Doug
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@lenya.apache.org
For additional commands, e-mail: dev-help@lenya.apache.org