More specifically, you would probably want to just prevent crawling of the database that has moved. The robots.txt would then look like this:

User-agent: *
Disallow: /database.nsf/

A single group is enough here: under one User-agent: * group, only the paths listed in Disallow lines are blocked, so every other path stays crawlable. Adding a second User-agent: * group with an empty Disallow is redundant and, depending on the crawler, can make the rules ambiguous.
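If you want to sanity-check the rules before deploying them, Python's standard-library robots.txt parser can evaluate them locally. This is a minimal sketch; the example.com URLs are placeholders, and only the /database.nsf/ path comes from the rules above.

```python
# Verify the robots.txt rules locally with the standard library parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /database.nsf/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The moved database is blocked for all crawlers...
print(rp.can_fetch("*", "http://example.com/database.nsf/somepage"))  # False
# ...while everything else remains crawlable.
print(rp.can_fetch("*", "http://example.com/otherdb.nsf/page"))       # True
```

Note that robots.txt path matching is prefix-based, so the trailing slash in /database.nsf/ blocks everything inside the database but would not block a hypothetical /database.nsf2 sibling.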