# sample robots.txt file for Jigsaw
User-agent: *
Disallow: /guest-demos/
Disallow: /status/
Disallow: /demos/
Disallow: /HyperNews/
Disallow: /cgi-bin/
Disallow: /css-validator/docs/
Disallow: /Friends/
Disallow: /api/
Disallow: /Benoit/Public/DVDDB/
Disallow: /css-validator/validator
Disallow: /css-validator/check
# stupid bots! a 404 is a 404!... I have to disallow non
# existent directories :(