[whatwg] Proposal: Exclude robots on a per-link basis
Kornel Lesiński
kornel at geekhood.net
Sat Nov 26 06:26:42 PST 2011
On Sat, 26 Nov 2011 12:20:28 -0000, Markus Ernst <derernst at gmx.ch> wrote:
> Viewing the logs of applications I wrote, I noticed that a considerable
> number of requests are from robots following links of types such as "Add
> to shopping cart" or "Remember this item" - links that typically point
> to the same page they are clicked on, with some GET variable that
> triggers an action on the server.
Actions that have significant side effects, such as "Remember this item",
should be performed with the POST method, which well-behaved bots do not
execute.
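For example, a GET link like the ones described could be replaced with a
small POST form; this is a sketch, and the "/cart" action URL and the
"action"/"item" parameter names are made up for illustration:

```html
<!-- Hypothetical: submit the side-effecting action via POST
     instead of a plain <a href="?add-item=..."> link. -->
<form method="post" action="/cart">
  <input type="hidden" name="action" value="add-item">
  <input type="hidden" name="item" value="12345">
  <button type="submit">Add to shopping cart</button>
</form>
```

Crawlers that follow the HTTP semantics of safe vs. unsafe methods will
follow GET links but will not submit such a form.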
Exclusion of URLs based on query string arguments can be done with
wildcards in robots.txt (Googlebot already supports this extension):
Disallow: *add-item=*
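In a complete robots.txt the rule would sit under a User-agent group; a
minimal sketch, assuming the same hypothetical "add-item" parameter:

```
# Wildcard patterns are an extension supported by Googlebot and
# other major crawlers, not part of the original robots.txt spec.
User-agent: *
Disallow: *add-item=*
```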
--
regards, Kornel Lesiński