Community Central
Forums: Index Help desk Spam hurdle too much
Fandom's forums are a place for the community to help other members.
To contact staff directly or to report bugs, please use Special:Contact.
Archive
Note: This topic has been unedited for 3211 days. It is considered archived - the discussion is over. Information in this thread may be out of date. Do not add to it unless it really needs a response.
Notes
  • The captchas should only trigger when you are adding new external URLs (e.g. http://somelink.com or [http://somelink.com somelink]) or if you are creating a new user login.
  • The captchas should not affect sysops or flagged bots. If you find they are, please logout and log back in to refresh your cookies.
  • The captchas are being tested in simple (math formula) mode, but the intention is to have graphical captchas working soon, so please do not hastily change the wording on the captcha system messages.
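For readers wondering what "adding new external urls" means in practice, here is a rough sketch of such a trigger. The function names and the regex are my own illustration, not the actual ConfirmEdit code:

```javascript
// Illustrative sketch: decide whether an edit adds a *new* external URL.
function extractUrls(text) {
  // Matches bare links and the target of [http://... label] bracketed links alike.
  return text.match(/https?:\/\/[^\s\]]+/g) || [];
}

function addsNewExternalLink(oldText, newText) {
  var oldUrls = extractUrls(oldText);
  var newUrls = extractUrls(newText);
  // Trigger only for URLs that were not already present on the page,
  // so re-saving a page full of existing links stays captcha-free.
  return newUrls.some(function (url) {
    return oldUrls.indexOf(url) === -1;
  });
}
```

Under this sketch, editing prose on a link-heavy page would not trigger the captcha; only introducing a URL the page did not previously contain would.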


Help:Captcha

Every time I save an external link on the Psychology Wiki I now have to fill in a form! This is too much: we have thousands of links to put up and this is just getting in the way. Can someone turn it off for us? The problem of automated spam is not great enough to warrant the work involved. It may be a necessary strategy if we develop a problem, but at the moment it is just getting in the way. On a side issue, the slowness of the site is an increasing problem; on several occasions now I have given up because of the time it's taking to save altered pages. I understand from Sannse this is being fixed. This has added to the frustration of the form filling, as it means saving a page twice! I guess you are just experimenting and trying things out, which is fair enough, but we need better communication so we know what is going on and how long the experiment will last. Dr Joe Kiff - User:Lifeartist (talk) 22:49, 8 September 2006 (UTC)

It applies on wikis other than Central too? Then the same goes for the Gaiapedia. We have no issues with automation: I am the only one around our wiki who even knows how to use a bot, and if someone does spam, I'll just block the automated spammer and do a simple revert. And this feature could actually harm good uses of bots. I use bots to do repetitive tasks on our wiki; if one of my bots collides with this system, it could break how fast I can fix things. I'm hoping that this feature does not stop my GaiaplotBot from working, because I use that to get information from the wiki we have partnered with. Dantman (Talk) 23:10, 8 September 2006 (UTC)
Dammit... Yes, these new features break the bot we use to grab current events from our partner, the plotwiki. Not only that, but it is possible that they will also break the bots that will be used for item migration when we get our new namespace. We can't move over 100 articles, change all the links all over the wiki, and then delete the redirects all by hand while having to solve a math equation every time we want to edit a page. Dantman (Talk)
Is it possible to keep this anti-spam measure on new account creation only, but allow registered users/registered bots to add external links without having to do math or read images? —Silly Dan (talk) 00:09, 9 September 2006 (UTC)
The problem of automated spam is quite large on Wikia overall. The Spam Blacklist is getting to the point where it starts to slow down the system. Just protecting against new account creation is hardly worth it, because one captcha intervention by a human allows a spambot access to almost 2000 wikis, which only stops when they get blacklisted. Captcha is a well-established tool (especially on Wikimedia, where many servers without 24/7 patrol have it enabled) for slowing spambots almost to a standstill. I don't know if this is currently in place for page moves, but it seems like it would be useful for that too.
That said, I just noticed this extension here myself and don't know the technical aspects nor policy behind it yet. Because of the way extensions are applied globally, an opt-out might not be very easy, but local whitelists might be possible. It makes sense to keep it for blankings, page moves and url-additions by non-sysops, to show the vandals and spammers that wikia is not a very fun target... but that is just my opinion. --Splarka (talk) 01:05, 9 September 2006 (UTC)

Update: re "And this feature could actualy harm good uses of bots."... Just tested it with Sannse: Having the flags: bot, sysop, or staff should make you immune to the captcha. Any trusted bot can be set +bot by a staff. Note: You may have to logout and login for it to skip you. --Splarka (talk) 08:23, 9 September 2006 (UTC)

I am having trouble passing the CAPTCHA

I did an experiment linking Wikia:Sandbox to http://www.google.com, but I had to answer the CAPTCHA about six times before I could save the page. Apparently I do not know how to answer the CAPTCHA correctly (why did it eventually let me through?), or the wiki does not understand me. You need to replace it with something that human editors like me can understand. --Kernigh 01:33, 9 September 2006 (UTC)

Because it always presents an arithmetic problem, maybe I should try to evaluate it. The instructions do not say to solve the problem; they say to type the text in the image, but this might explain why it let me in on "+ 0" queries. (I had been wondering whether to type the = sign in my answer!) Let http://www.microsoft.com trigger the CAPTCHA. --Kernigh 01:42, 9 September 2006 (UTC)

Right now it is in Simple mode, which is math. The Fancy mode (generally used) generates images with semiobscured writing that you type in. To see that mode in action, try adding a link at m:Meta:Sandbox. --Splarka (talk) 03:37, 9 September 2006 (UTC)
You know that since the equations are text, someone could modify a bot framework to grab the equation, evaluate it, then enter the value, rendering the system useless. But the system really causes bot issues; why not make it so that registered bots and sysops are not affected by it? That way proper bots are not killed off. As for the sysop comment, that's because I have a sysop account that I use for mass page deletion, which is basically for when I have to remove old item pages after the item migration. But there is no spam problem on our wiki. I wish it were possible to remove it for individual wikis, because this could destroy the advancement our wiki is trying to make. I like the fact that here on Wikia searching for Gaiapedia gives us as the first result, and we don't have to worry about domains; I don't know where we would go if Wikia stopped us from advancing. Dantman 04:41, 9 September 2006 (UTC)
I was just thinking that too. It wouldn't be all that hard to dump the equation into a temporary gif file and show that instead, to avoid simple auto-parsing and solving of the equation, would it? Steve and Jock 12:45, 10 September 2006 (UTC)
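Steve and Jock's suggestion amounts to generating the equation and its answer server-side, then rendering only the question text as an image. A minimal sketch of that split (the image-rendering step is assumed, not shown; names are my own illustration):

```javascript
// Illustrative sketch: generate an arithmetic challenge server-side.
// The "question" string would be dumped to a GIF for display, so a bot
// scraping the page HTML never sees the equation as text; the numeric
// "answer" stays on the server for comparison against the user's input.
function makeMathChallenge(rand) {
  rand = rand || Math.random;  // injectable for testing
  var a = Math.floor(rand() * 10);
  var b = Math.floor(rand() * 10);
  var plus = rand() < 0.5;
  return {
    question: a + (plus ? ' + ' : ' - ') + b,  // rendered to an image, never sent as text
    answer: plus ? a + b : a - b
  };
}
```

Even without obscured lettering, moving the text into an image defeats the trivial DOM-scraping attack shown later in this thread, though not a determined OCR attempt.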

The point is its making it unusable for me

I've got so fed up with it that I've been put off working on the wiki. It's so slow and frustrating, particularly when I can't be bothered to concentrate on it (because I'm thinking ahead) and add rather than subtract. It just slows everything up and is so painfully slow that it takes the enjoyment out of it. Please, please take it down, or give us an opt-out or something. It is not a viable solution to a problem that we don't specifically have. What about only doing it every thirty changes, and just make it add single digits? Hot and bothered, Dr Joe Kiff - User:Lifeartist (talk) 19:50, 9 September 2006 (UTC)

Did you try logging out and then back in again? --Splarka (talk) 09:29, 10 September 2006 (UTC)
Thanks as always, Splarka. I'm free again!!!! Dr Joe Kiff - User:Lifeartist (talk) 12:07, 11 September 2006 (UTC)
I thought this thing only functioned if you added an external link... how many times do you do that? If it's too many, you're not a good editor! ZPMMaker 12:48, 10 September 2006 (UTC)
Taking a look at the Psychology wiki, it seems to frequently link to other scholarly resources in order to provide serious, verifiable information — so by adding lots of external links, he's being a very good editor. (This is in contrast to the Star Wars wiki I usually edit, which sources articles primarily by linking to our articles on an issue of a comic-book spinoff or something like that.) Is there no way that the captcha could be applied differently on different wikis? I can see the use of the full extension being very helpful on Wookieepedia, but the Psychology wiki might benefit more if it was only applied to account creation. I suspect this isn't possible, though. —Silly Dan (talk) 13:26, 10 September 2006 (UTC)
Not so silly, Dan! Thanks for your support; it was appreciated. Dr Joe Kiff - User:Lifeartist (talk) 12:07, 11 September 2006 (UTC)

US$0.02: I think making it configurable for different wikis would be great, not only for the above reasons (local admins should know better than anyone else what level of protection makes their site run most efficiently) but also because any standardization will make spammers' lives easier.  In particular, we should assume that bots will eventually know how to solve equations, how to do OCR on un-distorted images of numbers/text, and how to make many edits to the same wiki in seconds (circumventing the "turn off after N edits" and "only apply every N edits" criteria proposed elsewhere).  Maybe if someone has been registered for several months, without ever being banned and with a history of edits to pages other than their user page, they should be exempt (but this criterion should be automated, because on large wikis the rubber-stamping would simply be too much work).  Perhaps there could also be some way to "bind" an IP to a particular registered user (on Doom Wiki, where I hang out, respectable persons sometimes want to make edits from places they can't log in).

I know it's gauche to ask programmers for extra features, but the captcha has unexpectedly been applied to wikis with no real spam problem, and won't by itself put any spammers out of business because there are many wikis besides those hosted here, so I feel like we at least deserve more control.  In fact, as more and more ingenious bots are invented by real editors for constructive uses, the Wikia admins must be willing to maintain the code accordingly.  I also wouldn't know what to tell someone with a severe vision problem (my own eyesight is OK and I still don't get the distorted numbers right 100 percent of the time).    Ryan W 05:28, 13 September 2006 (UTC)
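Ryan W's automated exemption criteria could be sketched roughly as follows. The threshold and field names are invented for illustration; this is not actual Wikia code:

```javascript
// Illustrative sketch of an automated "trusted editor" check:
// registered for several months, never banned, and with edit history
// beyond their own user page. All values here are hypothetical.
var NINETY_DAYS_MS = 90 * 24 * 60 * 60 * 1000;

function isCaptchaExempt(user, now) {
  return (now - user.registered) >= NINETY_DAYS_MS &&  // account age
         !user.everBanned &&                           // clean record
         user.nonUserPageEdits > 0;                    // real contributions
}
```

Because every criterion is a field lookup, the check costs nothing per edit, which addresses Ryan W's point that rubber-stamping by hand would be too much work on large wikis.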

Again, and as stated above, the formulas are just a rough test, and not the final version. Distorted images of text will be used (if the captcha is going to be permanent), as is standard for mediawiki captchas. Also, there is a spam problem, on dozens of inactive wikia and quite a few active ones (but as there are thousands of wikia, some don't notice it). Also (IMHO) having wikia able to self-exempt themselves from captchas will probably just make them more of a target, and will make it harder to find and blacklist spam (especially if it is on wikia that aren't patrolled 24/7). --Splarka (talk) 08:02, 13 September 2006 (UTC)
I never suggested an opt-out option.  On the other hand, you just said that the present problem affects only a small fraction of the wikis here.  Therefore, in all other cases, a mandatory captcha creates the problems described on this page without providing tangible benefits in return.  If, at some future time, a wiki started getting more spam, the local admins could just turn the captcha back on.
If a local admin sees a lot of spam on their wiki and doesn't even try to increase the level of protection, then that admin needs to be educated or replaced, and a crisis would surely accelerate that process (unless the wiki was only created for the sake of that admin's ego, in which case it is probably doomed anyway).  I agree that inactive wikis should be "shepherded" by not making the captcha opt-in.
I guess I can see one advantage of forcing it on everybody: if it was opt-out, the spammers could migrate to another small fraction of active wikis (who would then opt back in), then another, then another... it might take years to get everybody locked down, during which time the blacklist would continue to swell.
Also, what can be done about the workaround mentioned below under "Templates"?    Ryan W 10:07, 13 September 2006 (UTC)

Captcha

Personally, I think this thing is great. Mind you, I've only used it once, being an admin. On another note, why do the numbers in the "fancy" version (see Splarka's comment above) need to be obscured? A bot shouldn't be able to figure them out even if they're just normal image files. Obviously if they're text then a bot would be able to, but as an image the bot can't do squat. Why make it difficult for human users to read the numbers?! ZPMMaker 12:46, 10 September 2006 (UTC)

Why not copy a bit of text from an image? Bots can't copy text from an image, and it's easier for someone to copy a few letters from an image than to solve a formula all the time. Dantman (Talk) 13:25, 10 September 2006 (UTC)
I agree, but I still don't see why the numbers need to be obscured. It just makes things difficult! I'm not complaining at Splarka or whoever created this "fancy" version, but at any website in general that obscures the numbers. It's not going to make any difference! ZPMMaker 11:15, 11 September 2006 (UTC)
m:ConfirmEdit_extension. And the text in the images has to be obscured, because it is very simple to make a text recognition program to read these images for almost any known font. --Splarka (talk) 22:59, 11 September 2006 (UTC)
I know most bots aren't written in JS, but it's easier for me to write than DOM handling in Python for pywikipedia. Here's just an example to start.
var captcha = document.getElementsByTagName('label')[0];
var eq = captcha.innerHTML;
var nums;
var answer;
if(eq.search(/\+/) != -1) {
     nums = eq.split(/\+/);
     answer = parseInt(nums[0], 10) + parseInt(nums[1], 10);
} else if(eq.search(/-/) != -1) {
     nums = eq.split(/-/);
     answer = parseInt(nums[0], 10) - parseInt(nums[1], 10);
} else if(eq.search(/\*/) != -1) {
     nums = eq.split(/\*/);
     answer = parseInt(nums[0], 10) * parseInt(nums[1], 10);
} else if(eq.search(/\//) != -1) {  // "/" must be escaped in a regex literal: /\//, not ///
     nums = eq.split(/\//);
     answer = parseInt(nums[0], 10) / parseInt(nums[1], 10);
} else {
     answer = eq;  // no operator found: the captcha just asks you to copy the text
}
document.getElementById('wpCaptchaWord').value = answer;  // ".value", not ".input", fills the field
document.getElementById('editform').submit();  // wpSave is a button; submit the edit form itself
It's poorly written and needs to be ported to another language, but the DOM model is the same. If someone ported it into bot programming, it would handle one-digit addition, subtraction, multiplication, and division, and also being asked to copy the contents of the box. That would be part of a bot that would bypass the captcha if you don't turn the fancy mode on. But if you turn on a fancy captcha that asks you to type what's inside an image, just like in signup forms on other sites, then a bot couldn't pass it. There's your proof: that code is something anyone with a little bit of JS, DOM, and bot framework knowledge could use against the captcha. Dantman (Talk) 12:32, 13 September 2006 (UTC)

Hey, I don't mind having the extra Special:Captcha screen pop up when the edit has external links. If it helps stop vandalspam, I am for it. But when the question popped up, it removed my edit summary without telling me. Can this be changed so it doesn't wipe that out? I would hate to solve one problem but then create another. — MrDolomite | Talk 20:49, 10 September 2006 (UTC)

I've had to deal with it a couple of times too, and so far I don't really mind it. It hasn't created too many problems for me. Still, it could be a bit of an inconvenience. If it's really set up to prevent vandals from spamming, then how about having it so that regular users don't have to deal with it? For example, if a user has, say, more than 100 or so edits at a particular wiki, then the filter is automatically turned off for them? George B. (talk) 22:42, 10 September 2006 (UTC)
It's possible to create extra flags/groups that can be set by bureaucrats. So why not make a "trusted" group that the captcha ignores, and then bureaucrats can give that flag to the regular users on their wiki? Or you could use a variant of the system on Meta: they can protect pages so that only users who have been registered for more than 4 days can edit them. Why not adapt that so that everyone anonymous or younger than 4 days is affected by the captcha? Dantman (Talk) 05:20, 11 September 2006 (UTC)
Yeah, I'm for that. I don't see why it should affect those who have an established history with a particular wiki and are already known not to be spammers. George B. (talk) 03:22, 13 September 2006 (UTC)
Willy on Wheels then made accounts he waited five days to use. But nobody remembers now that Wikipedia deleted the article honoring him. Tretonin 05:32, 15 September 2006 (UTC)
I also found it annoying that the edit summaries were being removed. The thing is, admins are not going to know how often this is annoying normal people because they never see it - but it may still be a factor. :-) --GreenReaper(talk) 05:37, 15 September 2006 (UTC)
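The flag-based exemption discussed in this thread (bot/sysop/staff today, plus Dantman's proposed "trusted" group) boils down to a simple membership check. A sketch, with the group names taken from the thread but the function itself invented for illustration:

```javascript
// Illustrative sketch: skip the captcha for any user carrying an
// exempting flag. "trusted" is the hypothetical bureaucrat-settable
// group proposed above; the other three are the flags Splarka confirmed.
var EXEMPT_GROUPS = ['bot', 'sysop', 'staff', 'trusted'];

function skipsCaptcha(userGroups) {
  return userGroups.some(function (g) {
    return EXEMPT_GROUPS.indexOf(g) !== -1;
  });
}
```

Note that this check runs against the groups stored in the login session, which is consistent with the advice above to log out and back in after a flag change.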

Other languages

Would it be possible to have an explanation adapted to the language of each wiki? For example, in French: "Protection anti-spam : pour valider votre modification, pourriez-vous écrire les mots qui apparaissent dans cette image" ("Anti-spam protection: to validate your edit, could you write the words that appear in this image"). D.Liziard 13:20, 11 September 2006 (UTC)

These messages can be changed via the pages MediaWiki:Captcha-short, MediaWiki:Captchahelp-text, MediaWiki:Captcha-createaccount-fail, MediaWiki:Captcha-createaccount and MediaWiki:Captchahelp-title; however, this can only be done individually on each Wikia, and only by admins. Szoferka 13:38, 11 September 2006 (UTC)
Thank you! D.Liziard 14:58, 11 September 2006 (UTC)
Kernigh writes: The message should say to solve the equation, not to write the words in the image. (See my post above and Scott Hanson's post on wikia-l.) The instructions need to be made correct. I tried to fix this (in English) at NetHack Wikia (w:c:NetHack:MediaWiki:Captcha-short, w:c:NetHack:MediaWiki:Captcha-createaccount) and Oberin Wikia (w:c:Oberin:MediaWiki:Captcha-short, w:c:Oberin:MediaWiki:Captcha-createaccount). Well, at least we are annoying those spammers. They deserve it. They should not be filling our Wikia with unrelated links! --Kernigh 16:17, 12 September 2006 (UTC)
Re: "The instructions need to be made correct.", we are switching to images soon (the formula version is just a test), so if you change it, you'll just have to change it back later. --Splarka (talk) 21:55, 12 September 2006 (UTC)

Templates

Easy to circumvent with templates: [1] --85.195.123.29 08:18, 12 September 2006 (UTC)

Works when the template is substed, too. --85.195.119.14 08:19, 12 September 2006 (UTC)
You can also bypass it using <noinclude>http://</noinclude><includeonly>http://</includeonly>en.wikipedia.org. Dantman (Talk) 04:03, 13 September 2006 (UTC)
And that <!-- --> comment trick too. -- I need a name 14:49, 16 October 2006 (UTC)
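A plausible server-side fix for the bypasses described in this section would be to strip the transclusion and comment markup before scanning the wikitext for URLs, so a "http://" split across tags gets reassembled. A rough sketch (my own illustration, not the actual extension's code):

```javascript
// Illustrative sketch: normalize wikitext before running the URL check,
// so <noinclude>/<includeonly> wrappers and HTML comments cannot be used
// to hide or split an external link.
function normalizeWikitext(text) {
  return text
    .replace(/<\/?(?:noinclude|includeonly)>/g, '')  // drop transclusion wrappers
    .replace(/<!--[\s\S]*?-->/g, '');                // drop HTML comments
}
```

After normalization, the noinclude/includeonly trick above yields text that still contains "http://", so the ordinary URL detection would fire on it.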

Problems

Doesn't work for me at all. No matter how much I try, I just get the same page with another equation. (I am using anonymouse.org because the machine I'm using is blocked from Wikia for some reason - maybe that's the cause?) Also, it forgets edit summaries. --85.195.119.22 08:23, 12 September 2006 (UTC)

I tried to use anonymouse.org to reproduce the problem. Yes, I am unable to pass the captcha. However, it did remember my summary. --Kernigh 16:33, 12 September 2006 (UTC)
Might be browser-dependent, too, like using the "back" button with POSTDATA.  My summaries are remembered.    Ryan W 05:28, 13 September 2006 (UTC)

When I lost my account and tried to create a new one, this thing caught me. I suppose you don't really need an account, so if it stops people from filling the wiki with garbage, I'm OK with it.

Very discouraging

These are very discouraging for me. I can certainly understand the need for some protections, but if I'm working in a wikia that I'm not a sysop on, it's very frustrating. And on the question above about why we are creating links outside of the site so often, any time we want to reference anything outside of wikia or wikipedia, it's an external link.

I think this is too much, and I think it's going to drive people away. Chadlupkes 17:14, 18 September 2006 (UTC)

Heh, you got that right... The real reason I'm thinking of expanding the Gaiapedia into the GaiaMeta Organization and creating some extras for the organization is so we can do some things that we could never get done on-wiki (special extensions and MediaWiki hacks), but the introduction of the captcha was actually the thing that made me first consider it. It's not entirely the captcha's fault, but I am still strongly considering it; the only thing stopping me from moving the information into the announcements on our wiki is that I'll first need to get a new job (my last one didn't have enough leftover hours to keep me) and save up so we can start the hosting. Dantman (Talk) 22:12, 18 September 2006 (UTC)

On the discouragement theme

Has anyone noticed a decline in casual edits since the captchas came in? Certainly my impression is that activity on the psychology wiki has taken a hit, but it's hard to be sure whether it's coincidental or not. Dr Joe Kiff - User:Lifeartist (talk) 00:24, 19 September 2006 (UTC)

I never attributed it to captcha... but for a while now our edits have gone from small, to none... the only one who has made any edits other than me was the founder of the wiki we are partners with. Dantman (Talk) 14:24, 19 September 2006 (UTC)

Why not for new users only?

Why doesn't the captcha only affect new users (like semi-protection)? Seems like the obvious way to balance effectiveness and unobtrusiveness to me. --Tgr 21:31, 25 September 2006 (UTC)

Spammers can create new users and let them camp for as long as they want. After whatever time period has elapsed, they'll have 2000 wikis that user is registered and autoconfirmed on. It would make the captcha much less effective. --Splarka (talk) 22:13, 25 September 2006 (UTC)
Doesn't most spam come from anonymous editors? The wiki I'm on has only had commercial spam from registered users once, months ago. Every other piece of spam seems to come from IPs. —Silly Dan (talk) 22:22, 25 September 2006 (UTC)
A large percentage did prior to the captcha, but spammers can easily adapt.

For the Nay-Sayers, some semisolid proof

Just a note, the Captcha was installed September 8th. Please see the spam blacklist history to see the dramatic dropoff in new additions to the list since then. Also see the talk page history for the dramatic decrease in reports of spam. This is a good thing (tm). --Splarka (talk) 22:29, 25 September 2006 (UTC)

I wasn't doubting that it was effective, I'm just raising the question of whether it is worth the cost. If so, and that's a Wikia Staff decision, great. Those of us stubborn enough to grit our teeth and continue building will get over it.  :) Chadlupkes 20:03, 26 September 2006 (UTC)

Page-blanking

I've noticed this working on external links: it seems to help reduce spam. Is it possible to get it to work for page-blanking as well (so if the software noticed a large decrease in an article's size, a captcha would be required)? Or would that be either too difficult to implement or cause too many problems for legitimate editors? —Silly Dan (talk) 01:18, 7 October 2006 (UTC)

There are very few legitimate reasons to blank pages; I'll add it to the list of things we want captchas applied to. --Splarka (talk) 01:33, 7 October 2006 (UTC)
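The size-drop heuristic proposed in this section could be sketched as follows. The thresholds and function name are invented for illustration, not taken from any real configuration:

```javascript
// Illustrative sketch: require a captcha when an edit removes most of a
// page. Small pages are skipped to avoid flagging legitimate trims.
function looksLikeBlanking(oldText, newText, minBytes, maxRatio) {
  minBytes = minBytes || 500;   // hypothetical: ignore pages under 500 chars
  maxRatio = maxRatio || 0.2;   // hypothetical: flag edits keeping < 20% of the text
  if (oldText.length < minBytes) return false;  // tiny pages: too noisy to police
  return newText.length < oldText.length * maxRatio;
}
```

A ratio-based check like this would catch full blankings and near-blankings while leaving ordinary copyedits, which shrink pages only slightly, captcha-free.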