[SAC] robots.txt


I and others have observed trac.osgeo.org being slow lately. I dug a wee
bit and I see a lot of spidering activity, particularly on the OpenLayers
Trac. I checked robots.txt and it seems whoever set up the OpenLayers Trac
instance did not add the robots.txt entries per our procedures. I have added
them - hopefully this will help a bit.
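For reference, the entries in question generally look something like the
following. This is only a sketch - the exact user-agent rules and path list
come from our procedures, and the paths below are illustrative Trac views
that are expensive for crawlers to walk:

```
# Keep crawlers out of the dynamically generated Trac views,
# which are costly to render and endless to spider.
# Illustrative paths - adjust the prefix for the instance
# (e.g. the OpenLayers Trac) and match the SAC procedures.
User-agent: *
Disallow: /changeset
Disallow: /log
Disallow: /browser
Disallow: /timeline
Disallow: /search
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but
abusive ones may need to be blocked at the web server instead.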


I encourage folks to keep an eye on Trac to see how it is performing.

Best regards,
I set the clouds in motion - turn up | Frank Warmerdam, warmerdam@pobox.com
light and sound - activate the windows | http://pobox.com/warmerda
and watch the world go round - Rush | Geospatial Software Developer