Board Thread:General Discussion/@comment-5079138-20170102223919

So my test wiki's robots.txt suddenly changed itself to this.

User-agent: *
Noindex: /
Disallow: /

Which means that NO search engines SHOULD index my test wiki: "Disallow: /" blocks crawlers from every page, and "Noindex: /" is a non-standard directive that some search engines have treated as a request not to index anything. Which is weird, because robots.txt isn't supposed to be editable by wiki users at all.
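If you want to see how a standard parser reads that file, Python's stdlib robots.txt parser works as a quick sanity check. This is just a sketch; the wiki URL below is illustrative, not the real one:

```python
# Check what a standards-compliant parser makes of the replaced robots.txt.
# Note: urllib.robotparser only understands standard directives (User-agent,
# Disallow, Allow, ...), so the non-standard "Noindex" line is ignored here.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Noindex: /",
    "Disallow: /",
])

# With "Disallow: /" under "User-agent: *", every path is off-limits
# to every crawler.
print(rp.can_fetch("Googlebot", "https://dtss-test.wikia.com/wiki/Main_Page"))  # False
print(rp.can_fetch("*", "https://dtss-test.wikia.com/"))                        # False
```

So even ignoring the Noindex line, the Disallow alone is enough to shut out every well-behaved crawler.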

Compare that to a normal robots.txt from here, which only excludes certain features from being indexed.

I don't know why this happened, but I strongly suspect Wikia has started implementing a spam filter: when it detects something fishy on a wiki, it replaces the whole robots.txt so the wiki ISN'T indexed by Google.

I know that dtss-test was supposed to be a pointless test site, but still, this makes me aware that something fishy is happening silently. I mean, there's no way I could have changed the robots.txt myself, so I strongly suspect it was changed by a staff member or some automated process.

I've contacted Wikia staff for clarification, because THIS isn't supposed to happen at all.

Not sure if anyone else has run into this problem before; maybe it was just my test wiki.