Forum:Robots.txt

Hey! I recently found a page on the wikia I frequent called 'Robots.txt' with the content:
 * User-agent: *
 * Disallow: /

This was added by a user with a low edit count, who no longer edits with us.

I've never seen or noticed this on other wikias. I don't totally understand the explanations I found in my brief research, but according to Wikipedia, "Robots are often used by search engines to categorize and archive web sites". So disallowing them sounds like something we don't want, surely? I want the site to be found on search engines and so forth.

Could anyone advise, tell me why it is necessary, or relieve me from my ignorance? Thanks! --Grynd 02:07, April 26, 2010 (UTC)


 * It can be safely deleted, not necessary at all on a wiki. -- 03:17, April 26, 2010 (UTC)

I would like to know a little more about your robot findings. Please add more information. -Cgorham1


 * It was just a regular wiki page that doesn't actually control anything. An actual robots.txt would be server-side (served from the site root), not controllable by users, and spelled with a lowercase "r". -- 20:57, April 26, 2010 (UTC)
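 * For anyone curious what those two lines would actually do if they were in a real robots.txt at the site root: a hypothetical sketch using Python's standard-library robotparser shows that "User-agent: *" plus "Disallow: /" tells every crawler to skip every path (the example page path is made up for illustration):

```python
from urllib import robotparser

# Parse the same two directives that were pasted onto the wiki page.
# "User-agent: *" applies the rules to all crawlers;
# "Disallow: /" blocks everything under the site root.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# Every crawler is refused for every path (example path is hypothetical).
print(rp.can_fetch("Googlebot", "/wiki/Main_Page"))  # False
print(rp.can_fetch("*", "/"))                        # False
```

   Since the page in question was just an ordinary wiki article and not the file at the site root, crawlers never saw it, which is why the site still shows up in search engines.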