User blog comment:Daniel Baran/Search Developments: Big Picture/@comment-1649337-20120510113731/@comment-4674838-20120511204057

To clarify the problem I was visualizing:

A regex is a pattern. For example, suppose you want to redirect all pages that match the regex (Power)?[ ]?(Station|station|Stations|stations|Plant|plant|Plants|plants) to the page titled "Power Plant".
 * For example, the page "Plant" matches that regex, so a new page called "Plant" is automatically created (say, by User:WikiaBot) with the content #REDIRECT [[Power Plant]].
 * Thought: If the page "Plant" already exists, what happens? Is it left alone? Is it edited and its contents overridden to make it a redirect?
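To make the matching concrete, here's a minimal Python sketch of the check a hypothetical redirect bot would perform (the pattern is taken verbatim from the example above; the function name is illustrative):

```python
import re

# The example rule: pages matching this should redirect to "Power Plant".
PATTERN = re.compile(r"(Power)?[ ]?(Station|station|Stations|stations|Plant|plant|Plants|plants)")

def matches_redirect_rule(title):
    # fullmatch: the entire title must match the pattern, not just a substring
    return PATTERN.fullmatch(title) is not None

print(matches_redirect_rule("Plant"))       # True  - whole title is one alternative
print(matches_redirect_rule("Power Station"))  # True
print(matches_redirect_rule("Plantation"))  # False - no alternative covers the whole title
```

Note that `fullmatch` is the right choice here: with a plain substring search, "Plantation" would also be caught, which is exactly the kind of over-matching discussed below.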

Now suppose that you also decide to redirect all pages that match the regex (Flowers|Flower|Weeds|Weed|Plant|Flora) to the page titled "Plants".
 * For example, the page "Plant" also matches this regex, so you have now instructed the bot to redirect the same page to both "Power Plant" and "Plants". So what happens? Does it redirect to the former or the latter? Is a disambiguation page for "Plant" automatically created?
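The collision is easy to demonstrate. In this sketch (rule list and function name are illustrative, not any real bot's API), "Plant" fully matches both rules, so the bot is left with contradictory instructions:

```python
import re

# Two redirect rules, as in the examples above: (pattern, target page).
RULES = [
    (r"(Power)?[ ]?(Station|station|Stations|stations|Plant|plant|Plants|plants)", "Power Plant"),
    (r"(Flowers|Flower|Weeds|Weed|Plant|Flora)", "Plants"),
]

def redirect_targets(title):
    # Collect every rule whose pattern matches the whole title.
    return [target for pattern, target in RULES
            if re.fullmatch(pattern, title)]

print(redirect_targets("Plant"))  # ['Power Plant', 'Plants'] - two rules claim one page
```

Any real implementation would have to pick a policy for this case: first rule wins, last rule wins, refuse to act, or flag the page for a human, and that policy needs to be decided before the bot runs, not after.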

Regex dangers
 * Suppose that, with good intentions, you decide to redirect all pages that match the regex Time.+ to "Time". You've now unwittingly given instructions to create an effectively unlimited number of pages (every possible title beginning with "Time") and redirect all of them to "Time". Once you realize your mistake... how do you kill the process, and who's going to go back and delete everything?
 * This very sort of example is why Special:AbuseFilter has rate caps: regexes are tricky, and it's very easy for someone who doesn't know what they're doing (and even some who do) to make a huge mistake.