Admin Forum:Losslessly compressing all images on a wiki

Anyone know of an automated way to losslessly compress all images on a wiki? Or, barring that, anyone know of an automated way of downloading every image on a wiki and then re-uploading new versions to the correct filename? 05:10: Sun 11 Nov 2012
 * Since the wiki servers are run by Wikia, why would you be interested in that? They don't take up any room on your personal storage devices. It'd be a matter for Wikia's administrative staff.
 * Each file format has different features as well. JPEG, for instance, doesn't support lossless compression; if you recompress it, it will be lossily recompressed. ICO doesn't support compression at all. PNG's lossless compression is adjustable, though. GIFs don't have adjustable compression settings.
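 * To illustrate what "adjustable lossless compression" means: PNG's internal compressor is DEFLATE, the same algorithm exposed by Python's zlib module. A minimal sketch (zlib standing in for PNG's compressor, with made-up byte data) shows that changing the level changes the size, never the data:

```python
import zlib

# Stand-in for an uncompressed image buffer: mildly redundant bytes.
data = bytes(range(256)) * 1024

fast = zlib.compress(data, level=1)  # fastest setting, larger output
best = zlib.compress(data, level=9)  # slowest setting, smaller output

# Both settings are lossless: decompressing either one recovers
# the exact original bytes; only the compressed size differs.
assert zlib.decompress(fast) == data
assert zlib.decompress(best) == data
print(len(fast), len(best))
```

This is why a PNG can often be re-saved smaller with no quality change, while a JPEG re-save goes through a lossy decode/encode cycle.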
 * -- 70.24.186.245 12:09, November 11, 2012 (UTC)


 * Image compression can reduce page loading time, which benefits probably every wiki, especially pages with lots of files on them. Lots of files on a page is generally a bad idea, but sometimes it can't be helped, so we turn to image compression.
 * I don't know the ins and outs of the code used, but the RuneScape Wiki has a bot that can compress PNG and GIF files as well as optimise JPEG files, although the latter option is currently disabled. I believe it features a way to skip reuploading unless there has been significant compression, which can be adjusted as you feel necessary. It's written in Java, with the source available and released under the GPLv3, so you can run it if you wish. The bot is operated by User:A proofreader, who can usually be found on RuneScape's IRC channel on freenode, which is probably the quickest way to get hold of them.
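 * The "skip the reupload unless the savings are significant" check could be sketched like this (a hypothetical threshold function, not the actual bot's code; the 5% default is an assumption):

```python
def worth_reuploading(original_size: int, compressed_size: int,
                      min_saving: float = 0.05) -> bool:
    """Skip the reupload unless the recompressed file saves at least
    min_saving (here 5%, an assumed default) of the original size."""
    if compressed_size >= original_size:
        return False
    return (original_size - compressed_size) / original_size >= min_saving

print(worth_reuploading(100_000, 90_000))  # 10% saved -> True
print(worth_reuploading(100_000, 99_000))  # only 1% saved -> False
```

The threshold matters because every reupload adds a new file revision to the wiki's history, so tiny savings aren't worth the churn.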


 * MediaWiki resamples and re-encodes images uploaded to Wikia, and usually displays a different file from the one that was uploaded if the file is big (if you look at the actual files transferred with the webpage, they usually have a pixel-size number attached to their names). The files transferred with the page are already resized and shrunk. You only get the original upload if you click on the file. -- 70.24.250.26 13:28, November 13, 2012 (UTC)
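 * Those pixel-size-prefixed files follow stock MediaWiki's thumbnail path convention; a sketch of how such paths are built (default MediaWiki layout, so Wikia's exact setup may differ):

```python
import hashlib

def thumb_path(filename: str, width: int) -> str:
    """Build the default MediaWiki thumbnail path for a file; the two
    directory levels come from the MD5 hash of the file name."""
    name = filename.replace(" ", "_")
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    return f"/images/thumb/{digest[0]}/{digest[:2]}/{name}/{width}px-{name}"

print(thumb_path("Example image.png", 300))
```

So a page embedding a 300px version fetches the `300px-` thumbnail, not the original upload.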


 * I've previously heard what you're saying, 70.24. But I've done time trials of pages with images over 1 MB compared to the same page with the same pics reduced to under 100 kB, and the page definitely loads faster, especially on mobile devices. If there's no benefit to starting with a smaller file, why would the same page load in different times? (And, yes, I'm clearing my cache between tests, so I know I'm comparing fresh grab to fresh grab.) 18:40: Wed 14 Nov 2012


 * If you shrank a file from 1 MB to 100 kB, that's not lossless compression; you've lost quite a bit of data there. Unless the file was extremely redundant (like a BMP that's completely white and 10000×10000 pixels), you will never get that level of compression losslessly. Since your compression reduced the file's complexity, the newly resampled files that MediaWiki creates are themselves less complex and smaller as well. You can test that by zipping both images with maximum zip compression: if the zip of the 100 kB file is smaller than the zip of the 1 MB file, then you know you've lossily compressed something. This is no longer lossless compression you're talking about. -- 70.24.250.26 07:11, November 15, 2012 (UTC)
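 * That zip test can be approximated in a few lines of Python: zlib at maximum level stands in for "maximum zip compression", and the byte strings are stand-ins for real image files.

```python
import zlib

def deflate_size(data: bytes) -> int:
    """Deflated size at maximum level: a rough proxy for how much
    information the bytes actually contain."""
    return len(zlib.compress(data, level=9))

original = bytes(range(256)) * 4096            # stand-in for the 1 MB file
degraded = bytes(b & 0xF0 for b in original)   # low bits thrown away

# If the shrunken file holds much less information than the original,
# the shrinking step was lossy, not lossless.
print(deflate_size(original), deflate_size(degraded))
```

A genuinely lossless recompression would deflate to roughly the same size as the original, since no information was discarded.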


 * I suppose you could build a script around wget and Special:AllFiles to get all the files from the wiki, then use ImageMagick to batch-shrink them, and then you'll need to upload them back again... you might be able to expect-script that with a text-mode browser.
 * Though if you really want to shrink things, the first step would be to reduce the resolution of all images to QVGA (320×240) or smaller, then to reduce image colors to 256 or fewer (transform to GIF), then to convert all images to JPEGs with a relatively high compression setting, and then reupload. If you have transparent images, you can't convert them to JPEG, so keep those as GIFs; the alpha channel should be converted to dithered transparency instead. This would need a bot to update all the file names on pages after the upload.
 * -- 70.24.250.26 07:20, November 15, 2012 (UTC)