User blog:Drek'TharSuperSword/My test wiki's robots.txt suddenly changed and it prevents search engines from indexing it...

This has already been posted on the forum, but I don't think the forum is the right place for it.

So my test wiki's robots.txt suddenly changed to this:

User-agent: *
Noindex: /
Disallow: /

This means that NO search engine can index my test wiki, which is weird because the robots.txt isn't supposed to be editable at all. It happened recently: until today (1/3), I could still find my site on Google just fine.

I've tried googling my site, but most of the time it doesn't show up, and when it does, the result says the description isn't available because of the site's robots.txt.

Compare it to the normal robots.txt from various other wikis and you can clearly see that theirs are all identical and mine isn't. They only disallow some pages from being indexed, NOT the entire wiki.
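To show the difference, here's a minimal sketch using Python's standard `urllib.robotparser` to test both rule sets. The URL is just a placeholder for my test wiki, and the `Disallow: /wiki/Special:Search` line is only an illustrative example of the kind of narrow rule a normal wiki uses, not Fandom's actual file:

```python
from urllib.robotparser import RobotFileParser

# The rules my wiki's robots.txt was changed to:
broken = RobotFileParser()
broken.parse([
    "User-agent: *",
    "Noindex: /",   # non-standard directive; most parsers ignore it
    "Disallow: /",  # this line alone blocks every page for every crawler
])
# Any page on the wiki is now off-limits to well-behaved crawlers:
print(broken.can_fetch("Googlebot", "https://example.fandom.com/wiki/Main_Page"))  # False

# What a normal wiki's robots.txt roughly looks like (illustrative path):
normal = RobotFileParser()
normal.parse([
    "User-agent: *",
    "Disallow: /wiki/Special:Search",
])
# Regular article pages are still crawlable:
print(normal.can_fetch("Googlebot", "https://example.fandom.com/wiki/Main_Page"))  # True
```

Note that `Disallow: /` on its own is enough to block crawling of the whole site; the `Noindex:` line isn't part of the robots.txt standard and most crawlers ignore it.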

I don't know why this happened, but I highly suspect Fandom started implementing a spam filter that, when it detects something fishy on a wiki, replaces the whole robots.txt so the wiki ISN'T indexed by Google. Or maybe Fandom was just testing out a new robots.txt and forgot to revert it to a normal one. Who knows. I highly doubt it's been hacked, because hacking a test site like this would be completely pointless.

I know that the DTSS test wiki was always meant to be a pointless test site, but still, this makes me aware that something fishy is happening silently. There's no way a regular user can change the robots.txt, so I highly suspect it was changed by a staff member or by some automated software.

I've contacted Fandom staff for clarification and to ask them to revert my wiki's robots.txt to its normal state, because this isn't supposed to happen at all.

If any other wiki's robots.txt has been changed like this, please tell me in the comments, because it's a very serious thing if it's happening elsewhere, as it prevents a wiki from being indexed by search engines (and thus from showing up on Google). Fandom has to get REAL serious this time, either by supplying us with a robots.txt editor or by adding something that detects and automatically reverts robots.txt modifications in case things like this happen again.