source: main/adopters/ut/branches/1.8/src/main/webapps/secure-view/robots.txt @ 3609

Last change on this file since 3609 was 3609, checked in by Garth Braithwaite, 10 years ago

ut ut secure view - committed ZW's secure view webapp. This version was what ZW and Scott worked on with the AGRC map being used for selection and final output. It also provides low level LHD selection security via a MySQL db table. THIS TIME FROM THE TEST SERVER NOT THE DEV SERVER.

File size: 711 bytes
# Robots.txt file for domain:
# According to a
# the order should be most specific to least.
# Also see:

# Most validators say this is not standard but do not complain about
# it.  This code came from Google's site, and Google's validator
# complains about it!!!
# This, according to Google, removes any URL with a "?" in it.
User-agent: Googlebot
Disallow: /*?
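The `Disallow: /*?` rule above relies on Google's wildcard extension to robots.txt, in which `*` matches any run of characters in the URL path, so any path containing a `?` is blocked for Googlebot. A minimal sketch of that matching in Python (the function names here are illustrative, not part of any robots library):

```python
import re


def google_pattern_to_regex(pattern: str) -> "re.Pattern[str]":
    """Translate a Googlebot-style robots.txt path pattern into a regex.

    '*' matches any run of characters; all other characters are literal.
    The resulting regex is applied with re.match, so it anchors at the
    start of the URL path, mirroring robots.txt prefix semantics.
    """
    return re.compile(".*".join(re.escape(part) for part in pattern.split("*")))


def is_disallowed(path: str, pattern: str = "/*?") -> bool:
    """Return True if the path is blocked by the given Disallow pattern."""
    return google_pattern_to_regex(pattern).match(path) is not None
```

Under this reading, `is_disallowed("/view?content=home")` is true while `is_disallowed("/index.html")` is false, which matches the comment's claim that any URL with a `?` in it is removed.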

# All robots may crawl the domain; static files and dynamic requests are disallowed.
User-agent: *
Disallow: /artifact/
Disallow: /css/
Disallow: /image/
Disallow: /js/
Disallow: /jsp/
Disallow: /xslt/
Disallow: /view
Disallow: /secure/
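The rules in the `User-agent: *` section are plain prefix matches, and those can be checked with Python's standard `urllib.robotparser`. Note that this parser does not understand the Googlebot `*` wildcard, so only the prefix section is modeled here; `example.org` is a placeholder host:

```python
import urllib.robotparser

# The prefix rules from the User-agent: * section above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /artifact/
Disallow: /css/
Disallow: /image/
Disallow: /js/
Disallow: /jsp/
Disallow: /xslt/
Disallow: /view
Disallow: /secure/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Each Disallow line blocks every path that starts with the given prefix.
print(parser.can_fetch("*", "https://example.org/css/site.css"))  # False
print(parser.can_fetch("*", "https://example.org/viewer"))        # False: '/view' is a bare prefix
print(parser.can_fetch("*", "https://example.org/index.html"))    # True
```

Because `Disallow: /view` has no trailing slash, it blocks `/viewer` and `/view.jsp` as well as `/view` itself, which is worth keeping in mind when adding sibling paths.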