I have a page where I can upload an image and PHP creates two down-scaled copies: a thumbnail (max width 125px, max height 95px) and a 'main' display image (max width 550px, max height 600px). The original is kept and used as the high-quality version (as a link). Previously I only served the large image, but then users had to download a lot of data they didn't necessarily want.
I'll try to post my understanding of what my code is doing; hopefully it's not too inaccurate.
The way I'm resizing is to set the width to the maximum (either 125 or 550) and then scale the height, keeping the original ratio; if the height exceeds my maximum, I redo the scaling with the height first.
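For what it's worth, the "scale by width, fall back to height" logic above boils down to taking the smaller of the two ratios. A minimal sketch (the function name is my own; only the size limits come from the question):

```php
<?php
// Fit (srcW x srcH) inside a (maxW x maxH) box, preserving aspect ratio.
// Equivalent to "scale width first, redo by height if it overflows".
function fitWithinBox(int $srcW, int $srcH, int $maxW, int $maxH): array
{
    $ratio = min($maxW / $srcW, $maxH / $srcH);
    return [(int) round($srcW * $ratio), (int) round($srcH * $ratio)];
}

// e.g. a 3000x2000 photo constrained to the 550x600 'main' box:
[$w, $h] = fitWithinBox(3000, 2000, 550, 600); // 550x367
```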
I'm thinking that imagecreatefromjpeg() or imagecopyresampled() is holding the full-size bitmap in RAM, and that is what makes my PHP script run out of memory (smaller images - roughly 1500x1500px and below - work fine).
I have read that I can raise this limit in my php.ini, but I'm not sure my host will allow it, and more importantly I've read that I'd be better off improving my code's efficiency.
So is there a more efficient way to do what I want without removing any automation? (I don't want to be resizing and uploading 3 images!)
The GD image*() functions work with an uncompressed bitmap of the image in memory, which requires 4 bytes per pixel (3 color channels plus alpha), so the memory needed will be roughly width * height (in pixels) * 4 - and this is regardless of how small the compressed source file is.
You can adjust the memory limit at run time with ini_set('memory_limit', &lt;value&gt;). You could use getimagesize() to find the width and height, use the formula above to estimate the memory needed, and then pass that to ini_set(), plus some additional padding for the rest of the script, the thumbnail images, etc. (You could also define a maximum size, beyond which you return an error saying the picture needs to be made smaller before processing, should you desire.)
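A sketch of that estimate. The helper name, the 1.8 headroom factor, and the 16 MB flat padding are my own guesses, not measured values; only the 4-bytes-per-pixel formula comes from the explanation above:

```php
<?php
// Estimate memory needed to decode a (width x height) image with GD.
function estimateMemoryNeeded(int $width, int $height): int
{
    // decoded bitmap (4 bytes/pixel) * headroom, plus flat padding
    // for the rest of the script and the scaled copies
    return (int) ceil($width * $height * 4 * 1.8) + 16 * 1024 * 1024;
}

// Hypothetical usage against an uploaded file:
// [$w, $h] = getimagesize('upload.jpg');
// if (estimateMemoryNeeded($w, $h) > 256 * 1024 * 1024) {
//     die('Image too large - please resize it before uploading.');
// }
// ini_set('memory_limit', estimateMemoryNeeded($w, $h));
```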
"Please give us a simple answer, so that we don't have to think, because if we think, we might find answers that don't fit the way we want the world to be."
~ Terry Pratchett in Nation
On shared hosts, memory limits are often fixed and cannot be overridden (or at least have a ceiling), so that approach may not work for everyone. I was thinking it would be a good idea to process the image in stages: scale the original down to your 'medium' size, save the result, and immediately call imagedestroy() on the original to release the memory it was using. Then save the medium-size image and scale that down to your thumbnail.
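A sketch of that staged pipeline, so at most one full-size bitmap plus one scaled copy is ever in memory at once. The file paths and the helper name are hypothetical; the GD calls are real:

```php
<?php
// Scale a JPEG to fit inside (maxW x maxH), preserving aspect ratio,
// and free each GD bitmap as soon as it is no longer needed.
function scaleJpeg(string $srcPath, string $dstPath, int $maxW, int $maxH): void
{
    [$w, $h] = getimagesize($srcPath);
    $ratio = min($maxW / $w, $maxH / $h);
    $newW  = (int) round($w * $ratio);
    $newH  = (int) round($h * $ratio);

    $src = imagecreatefromjpeg($srcPath);
    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagedestroy($src);          // release the big bitmap immediately

    imagejpeg($dst, $dstPath, 90);
    imagedestroy($dst);
}

// Original -> medium, then medium -> thumbnail, so the full-size
// bitmap is freed before the thumbnail pass begins.
scaleJpeg('original.jpg', 'medium.jpg', 550, 600);
scaleJpeg('medium.jpg', 'thumb.jpg', 125, 95);
```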
Alternatively, you could investigate ImageMagick, which uses memory more efficiently than the GD library. It's easiest if your host has one of the ImageMagick extensions for PHP installed, but for simpler tasks like resizing images, you can just use system() or passthru() to run ImageMagick commands easily enough.
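A sketch of shelling out to ImageMagick as suggested. The filenames are placeholders; ImageMagick's `-resize 550x600` geometry fits the image inside the box while preserving aspect ratio, which matches the behaviour wanted here (append `>` to the geometry, e.g. `550x600>`, to only ever shrink):

```php
<?php
// Resize via the ImageMagick `convert` CLI instead of GD.
$orig = 'original.jpg'; // placeholder path
foreach (['medium.jpg' => '550x600', 'thumb.jpg' => '125x95'] as $out => $box) {
    $cmd = sprintf('convert %s -resize %s %s',
        escapeshellarg($orig), escapeshellarg($box), escapeshellarg($out));
    system($cmd, $status);
    if ($status !== 0) {
        die('ImageMagick convert failed');
    }
}
```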