
Yes, that will certainly break. You're effectively running a horribly expensive program every time the page is parsed. MediaWiki caps how many preprocessor nodes a page may expand; once your templates blow past that cap, it aborts the expansion and reports "Node-count limit exceeded".

There's no easy way around this. Since the data isn't likely to change any time soon, you could write a Python script that takes those inputs and writes the finished table to a text file, then paste that wikitext into the page. Alternatively, just present much less information.
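
As a rough sketch of that script idea: the snippet below reads rows from a CSV file and emits MediaWiki table markup you can paste straight into the page. The file names, the CSV format, and the column layout are all assumptions for illustration, so adapt them to your actual data.

import csv

# Hypothetical input: a CSV file with a header row, e.g.
#   name,level,cost
#   Fireball,3,120
INPUT_FILE = "data.csv"    # assumed input file name
OUTPUT_FILE = "table.txt"  # wikitext to copy into the wiki page

def rows_to_wikitext(rows):
    """Convert a list of rows (first row = headers) into a MediaWiki table."""
    header, *body = rows
    lines = ['{| class="wikitable sortable"']
    lines.append("! " + " !! ".join(header))
    for row in body:
        lines.append("|-")
        lines.append("| " + " || ".join(row))
    lines.append("|}")
    return "\n".join(lines)

with open(INPUT_FILE, newline="") as f:
    rows = list(csv.reader(f))

with open(OUTPUT_FILE, "w") as f:
    f.write(rows_to_wikitext(rows))

Run it once whenever the source data changes and copy the contents of table.txt into the page; the wiki then just renders static table markup instead of expanding all those templates on every parse.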