Forum:Running a cleanup bot

See, wishes do come true! One of our users saw my wish list and sent me this message. From your "wish list":
 * "Ability to take links out of a number of pages at once. (eg a lot of my pages have 20th century as a link and I would like to unlink them all without going to each one individually.)"

I've just finished writing a script that I can run from my account that will quickly remove all links to, for example, '20th Century'. I have a few questions before I start implementing it at full blast.
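For the curious, the core of such a link-removal script can be sketched in a few lines. This is only a minimal illustration, assuming the page source is plain wikitext; the `unlink` function name and the sample text are invented for the example, not taken from the actual script:

```python
import re

def unlink(wikitext, target):
    """Turn [[target]] and [[target|label]] links into plain text."""
    pattern = re.compile(
        r"\[\[(" + re.escape(target) + r")(?:\|([^\]]+))?\]\]")
    # Keep the label of a piped link, otherwise the link text itself
    return pattern.sub(lambda m: m.group(2) or m.group(1), wikitext)

text = "Events of the [[20th Century]] and the [[20th Century|last century]]."
print(unlink(text, "20th Century"))
# -> Events of the 20th Century and the last century.
```

The piped-link case matters: `[[20th Century|last century]]` should leave "last century" behind, not "20th Century".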


 * 1) Am I allowed to do this? Some larger wikis (including Wikipedia) are very strict about letting "bots" run on their site.  Technically, my script is probably a semi-bot because I have to feed it with a list of pages and then tell it to run (so I'm always in front of my computer when it's running).  As it's currently written, it's extremely limited in its scope and there is really no way for it to screw up.  Another argument against bots (besides the possibility of vandalism/massive abuse/massive mistakes) is that they clutter up the "Recent Changes" page.  That's probably my main concern about running it on this site, because "Recent Changes" is relatively small and might suffer from extreme clutter and unreadability with 100 or so edits coming from my account.  Fortunately, there is a solution to this &mdash; I can mark all of my automatic edits as "minor".  Then, people viewing the recent changes only have to click "Hide Minor Changes" to filter out my edits. Let me know if this is acceptable.  Until I get your approval, I'll just stick to small-scale testing.
 * 2) If this is ok, what other red-links do you want taken out? All the links that point to dates/decades/time periods/etc. to start? Or what about those interwiki language links that appear at the bottom?

Here is an example change that I made using the script. It doesn't do much, so the potential for screwing up is very low. &mdash; Loudsox

So the question is: is it OK to run this program? Is there a procedure to go through? All help and advice appreciated. Dr Joe Kiff - User:Lifeartist (talk) 20:34, 25 August 2006 (UTC)
 * Actually, Loudsox went ahead with a trial run and it seemed to work OK. However, I was alarmed that this could be done without permission. Somebody malicious could do a lot of damage in a very short time. Can these bots be stopped from running without clearance? What procedures do we have for repairing edits if they run into the hundreds? Dr Joe Kiff - User:Lifeartist (talk) 23:20, 25 August 2006 (UTC)
 * Bots are simply ways of automatically doing what a user could do; they are just programmed to do a repetitive task faster than a normal user could. So yes, anyone who can create one can easily use it.
 * To stop a bot like that, all you have to do is block the account that the bot is running on.
 * If a bot does massive damage to your wikia, just do a rollback on the edits made by the account that the bot runs on.
 * I'm actually thinking of a similar kind of bot for my wiki (a user-controlled one). To go with one of my programming language knowledge sets, I decided on a Firefox extension that could perform regex replaces on pages. That's probably my most effective idea. I intend to make it do some simple things like fixing up date links (on our wiki, months and years are made as links). Though, I intend to create a second account and set it as a bot account when I do finish the tool. dantman 01:40, 26 August 2006 (UTC)
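The regex replace idea above, applied to month and year links, might look roughly like the sketch below. This is a Python illustration of the pattern rather than the actual Firefox extension, and the `MONTHS` list, `DATE_LINK` pattern, and `unlink_dates` name are all invented for the example:

```python
import re

MONTHS = ("January|February|March|April|May|June|July|"
          "August|September|October|November|December")

# Matches [[May 12]], [[January]], [[1984]], with an optional |label
DATE_LINK = re.compile(
    r"\[\[((?:" + MONTHS + r")(?: \d{1,2})?|\d{4})(?:\|([^\]]+))?\]\]")

def unlink_dates(wikitext):
    """Strip the brackets from date links, keeping the visible text."""
    return DATE_LINK.sub(lambda m: m.group(2) or m.group(1), wikitext)

print(unlink_dates("Born on [[May 12]], [[1984]]."))
# -> Born on May 12, 1984.
```

The same substitution could be expressed as a single regex replace in a browser extension; the advantage of anchoring on a fixed month list and four-digit years is that ordinary article links are left untouched.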