I don't have a robots.txt file on any of my sites. Over the past week, however, Webmaster Tools has been reporting that Googlebot didn't crawl the site because it was unable to access robots.txt. On most days it failed on every attempt; once it succeeded on 1 of 7 tries. Has anyone seen this kind of thing? The e-commerce site discussed in an earlier thread is one of the most affected, so could there be a connection between the two anomalies?
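
For anyone wanting to diagnose the same symptom: a missing robots.txt is only fine if the server answers with a clean 404, since Googlebot treats 404/410 as "no restrictions" but backs off and reports the site as uncrawlable if the robots.txt request returns a 5xx or times out. Here's a quick Python sketch to see what the server actually returns (example.com is a stand-in for the affected domain):

import urllib.request
import urllib.error

url = "http://example.com/robots.txt"  # substitute the real domain
try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        # 200 would mean a robots.txt file exists after all
        print(resp.status, resp.reason)
except urllib.error.HTTPError as e:
    # 404 here is the healthy case for a site with no robots.txt.
    # 5xx is the problem case: Googlebot stops crawling until it clears.
    print(e.code, e.reason)
except urllib.error.URLError as e:
    # Timeouts and connection resets look like server errors to Googlebot too.
    print("connection failed:", e.reason)

If that request is intermittently erroring out or hanging rather than returning 404, that would match the mostly-failed, occasionally-succeeded pattern Webmaster Tools is showing.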