Forum:Bot, php parser and template

Hello!

Does anyone know of any PHP code/source that could parse a wiki article and extract the template content (and that would not break if there is a template within a template)?

Ty :) Hunter789 03:01, August 19, 2011 (UTC)


 * Unless you plan to do this for many pages, you don't need a bot for that. You can add ?action=render to the end of a template URL to see what its output is.


 * This is in order to extract data from pages (w:c:rappelz:Skell Fighter) and build up some other pages with it (w:c:rappelz:List of Skeleton type mobs), automatically, on a regular basis. Hunter789 03:09, August 19, 2011 (UTC)
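Once a template call has been isolated from the page source, it still has to be split into fields before it can feed another page. A minimal PHP sketch of that step — the `Mob`, `name` and `level` identifiers below are invented for illustration, not taken from the rappelz wiki:

```php
<?php
// Sketch: given one already-isolated template call, split it into the
// template name and its parameters. Pipes that sit inside nested
// {{...}} calls or [[...]] links are left alone.
function parseTemplate(string $tpl): array {
    $inner = substr($tpl, 2, -2);   // strip the outer {{ and }}
    $parts = [];
    $depth = 0;
    $buf = '';
    $len = strlen($inner);
    for ($i = 0; $i < $len; $i++) {
        $pair = substr($inner, $i, 2);
        if ($pair === '{{' || $pair === '[[') {
            $depth++;
            $buf .= $pair;
            $i++;                    // consume both characters
        } elseif ($depth > 0 && ($pair === '}}' || $pair === ']]')) {
            $depth--;
            $buf .= $pair;
            $i++;
        } elseif ($inner[$i] === '|' && $depth === 0) {
            $parts[] = $buf;         // a top-level pipe ends the current part
            $buf = '';
        } else {
            $buf .= $inner[$i];
        }
    }
    $parts[] = $buf;
    $name = trim(array_shift($parts));
    $params = [];
    $pos = 1;                        // positional parameters count from 1
    foreach ($parts as $p) {
        if (strpos($p, '=') !== false) {
            [$k, $v] = explode('=', $p, 2);
            $params[trim($k)] = trim($v);
        } else {
            $params[$pos++] = trim($p);
        }
    }
    return ['name' => $name, 'params' => $params];
}
```

For example, `parseTemplate('{{Mob|name=Skell Fighter|level={{Lv|33}}|Undead}}')` keeps the nested `{{Lv|33}}` intact as the value of `level` instead of splitting on the pipe inside it.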


 * I don't know where you want that PHP code to run; it can only run on a computer you have access to, not on that wiki or on any other wiki at Wikia.
 * The action=raw parameter to the MediaWiki software gives you the raw wikitext of a page. For example, for this page you can use:

http://community.wikia.com/index.php?title=Forum:Bot,_php_parser_and_template&action=raw
 * Usually, what you are trying to achieve is done the other way around: the information is held in wiki pages as a sort of database, and templates pull that information out. I think Wikia has Semantic MediaWiki and DPL installed and enabled by default; give them a try.
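Fetching that raw wikitext from PHP needs nothing beyond the standard library. A minimal sketch — the host is this wiki's; a bot targeting another wiki would use that wiki's own index.php:

```php
<?php
// Build the action=raw URL for a page on community.wikia.com.
// rawurlencode() percent-encodes characters like ':' and ',' in the
// title, which MediaWiki accepts.
function rawUrl(string $title): string {
    return 'http://community.wikia.com/index.php'
        . '?title=' . rawurlencode($title)
        . '&action=raw';
}

// Usage (a network call, so commented out in this sketch):
// $wikitext = file_get_contents(rawUrl('Forum:Bot,_php_parser_and_template'));
```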


 * I'm already running a bot written in PHP on that wiki (a cron job, once a day). It does a decent job on pages that include only one template (such as this one), but on pages with more than one, I think my script will go a bit crazy.
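The multi-template case is where a regex-based scan usually breaks: a lazy pattern like `/\{\{.*?\}\}/s` stops at the first `}}` it sees, cutting a nested template in half. A depth-counting scan avoids that; a minimal sketch, not tied to any parser library:

```php
<?php
// Sketch of a depth-counting scan over wikitext that pulls out each
// top-level {{...}} call, so a template nested inside another template
// stays inside its parent instead of breaking the match.
function extractTemplates(string $text): array {
    $templates = [];
    $depth = 0;
    $start = 0;
    $len = strlen($text);
    for ($i = 0; $i < $len - 1; $i++) {
        if ($text[$i] === '{' && $text[$i + 1] === '{') {
            if ($depth === 0) {
                $start = $i;           // remember where the outermost call begins
            }
            $depth++;
            $i++;                      // skip the second brace
        } elseif ($depth > 0 && $text[$i] === '}' && $text[$i + 1] === '}') {
            $depth--;
            if ($depth === 0) {
                $templates[] = substr($text, $start, $i + 2 - $start);
            }
            $i++;
        }
    }
    return $templates;
}
```

On `'before {{Infobox|a={{Nested|x}}|b=1}} after {{Stub}}'` this returns the two top-level calls, with `{{Nested|x}}` still embedded in the first one.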


 * And yeah, the site was made that way. I'm not sure I'd put everything in one page and split it up afterwards, but I'll look into your suggestions. Thanks! Hunter789 23:47, August 19, 2011 (UTC)


 * Just a note - neither the SMW nor the DPL extension is enabled by default; both have to be requested via Special:Contact. — Sovq 06:09, August 20, 2011 (UTC)