Thread:KockaAdmiralac/@comment-9605025-20191028042947/@comment-27345308-20191103123027

If you make your script do about 40 asynchronous requests at a time (that is, roughly 40 requests are in flight at any point in time), you can pull off 100-150 HTTP requests per second (assuming you have a good enough Internet connection), so for 44k revisions it should finish in less than an hour. At least, that's my experience from running a cross-wiki spamfinder which made an HTTP request for each page in the article namespace on all wikis (though that script's bandwidth was probably higher than average in most countries, so a few hours is also possible, but probably worth waiting out if you do decide to fetch the revisions via the API).

If you're using Node.js, it wouldn't be a bad idea. If you intend to do the processing in JavaScript executed inside your browser, it probably would be.