We are using ImageGen on a few of our websites, but the time has come to load balance and move to the cloud.
I am using my package (AST Amazon S3) to copy media files to Amazon S3 once they are uploaded via Umbraco. However, we are using ImageGen to generate the crops.
I am wondering if there is any way to force ImageGen to generate the crops on S3, or how I can customise ImageGen to do that?
Your help is much appreciated!
So this is something I'm just about to start looking into. I've found this package:
Which I believe tells Umbraco to use an S3 bucket as the location for both media and Examine libraries (I may be wrong about the latter). Anyway, because this is effectively re-routing the media location via virtual folders, I think it should work with the ImageGen integration as well.
This would require something more like ImageGen Enterprise, which doesn't exist. At least not yet and not in the near-term as it would require quite a change to the way ImageGen works.
The package Nik mentions looks promising, but at a quick glance it only handles requests for static files, which isn't how ImageGen works. That said, if you could trap the request to ImageGen, do a static file lookup against ImageGen's cache, and replace the ImageGen call with a request to the static file already in S3, that might just work. I can't quite see how to do that in a performant way at the moment, though. If a glass of wine this evening offers any illumination I'll share it with you!
And do please keep us posted on what you try, what works (or doesn't), and how you finally resolve the issue. Being such a clever coder I suspect you might just come up with a simple and elegant solution!
Hi Nik and Doug,
First of all thanks for getting back to me and sharing your ideas.
To be honest, Doug, I was kind of thinking the same: there is no easy way to handle this with ImageGen because of the way it works.
So rather than moving to Amazon S3, I would probably just use DFS (Distributed File System) to replicate the media and the ImageGen cache files to the other web servers.
Bear in mind how ImageGen handles the cache and you won't have any problems.
First of all, ImageGen runs on the web server, so if you have multiple web servers they will all need ImageGen, since you never know which server will respond to a visitor's request for an image.
Each server with ImageGen running must manage its own cache files. If you attempt to duplicate/replicate/share the 'cached' folders between servers running ImageGen you'll quickly get a corrupt cache and performance problems. If using DFS or similar be sure to exclude any 'cached' folders from replication.
You would certainly want to replicate any media files uploaded to the site across all servers so that any ImageGen request could be handled by any server. Just be sure you're only replicating the original source media files and not the generated cached files.
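To make the "replicate media, but never the cache" rule concrete, here is a minimal Python sketch of a sync step that copies a media tree while pruning any `cached` folder. The folder name, paths, and the idea of a custom sync script are all my assumptions for illustration; in practice you would configure the exclusion in DFS (or whatever replication tool you use) rather than write your own copier.

```python
# Hypothetical sketch: replicate an Umbraco /media tree to another server
# while excluding ImageGen 'cached' folders, which must stay server-local.
# The folder name below is an assumption, not real ImageGen configuration.
import os
import shutil

CACHE_DIR_NAME = "cached"  # per-server ImageGen cache folder (assumed name)

def sync_media(src_root: str, dst_root: str) -> list[str]:
    """Copy every file under src_root to dst_root, skipping anything
    inside a 'cached' directory. Returns the relative paths copied."""
    copied = []
    for dirpath, dirnames, filenames in os.walk(src_root):
        # Prune 'cached' folders in place so os.walk never descends into them.
        dirnames[:] = [d for d in dirnames if d.lower() != CACHE_DIR_NAME]
        for name in filenames:
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, src_root)
            dst = os.path.join(dst_root, rel)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)  # preserves timestamps, like a sync would
            copied.append(rel)
    return copied
```

The key point is the in-place pruning of `dirnames`: the original source files replicate to every server, while each server's generated cache stays untouched.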
In terms of optimizations and things to consider:
Thank you so much for your comprehensive explanation and optimisation tips, very, very useful!
It seems like we are going to buy the Professional version to get all of the features you mentioned. (I will get in touch in the next week or so.)
In conclusion, I will probably take the approach of saving all the original media files in S3 using my package, then force ImageGen to load the original images from S3 and store the cache locally on the web servers, rather than going with DFS.
However, I have a question about the above. Let's say we have two web servers and an image called homePageBanner.jpg, and let's assume ImageGen has generated the crops locally on both servers, so they are in each cache.
My question is: when the next request comes in, does ImageGen check the local web server first to see whether the crop has already been generated, and only load the image from S3 if the crop doesn't exist on that server?
A good overall plan that is both simple and effective.
One of the cool features of ImageGen is that it always checks that the source image hasn't changed since the cached version requested was made.
It's a simple size, date, and CRC check of the original source file. It takes no time at all for images that can be read directly from the web server's disks, but for images hosted remotely the source image needs to be retrieved, which can be sluggish due to network latency and the like. Not a massive delay, but sometimes noticeable compared to working with local files.
Once the source file is confirmed to be unchanged, ImageGen sends out the previously resized and cached version.
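The check described above can be sketched in a few lines. This is a conceptual Python illustration, not ImageGen's actual code; the function and field names are mine, and the resize step is stubbed out.

```python
# Conceptual sketch of the cache-validation flow described above:
# compare the source image's size, date, and CRC against the values
# recorded when the cached crop was generated. NOT ImageGen's real code.
import os
import zlib
from dataclasses import dataclass

@dataclass
class CachedEntry:
    size: int
    mtime: float
    crc: int
    crop_path: str  # where the previously generated crop lives

def source_fingerprint(path: str) -> tuple[int, float, int]:
    """Size, modification date, and CRC of the original source file."""
    st = os.stat(path)
    with open(path, "rb") as f:
        crc = zlib.crc32(f.read())
    return st.st_size, st.st_mtime, crc

def serve_crop(path: str, cache: dict[str, CachedEntry]) -> tuple[str, bool]:
    """Return (crop_path, was_cache_hit). On a miss, 'regenerate' the
    crop (stubbed) and record the source's current fingerprint."""
    fp = source_fingerprint(path)
    entry = cache.get(path)
    if entry and (entry.size, entry.mtime, entry.crc) == fp:
        return entry.crop_path, True   # source unchanged: serve cached crop
    crop_path = path + ".resized"      # stand-in for the real resize work
    cache[path] = CachedEntry(*fp, crop_path)
    return crop_path, False
```

Note how the fingerprint check requires reading the source file: cheap on local disk, slower when the original lives in S3, which is exactly the latency point made above.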
That's why I particularly recommend using client-side caching with ImageGen Professional. For remote images, you want to minimize the time it takes to fulfil a visitor's request, and the best way to do that is to avoid making the request at all! That's where careful use of the client caching time span will be super useful. Granted, it means ImageGen won't check whether the source image has changed, but if you consider how often that's likely to happen (for some sites, never!) you can use very long time periods. For other images you can use shorter, very short, or no client-side caching at all, as appropriate. Each 'class' of image can have its own cache time, so it's very flexible.
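The per-class cache times could look something like the sketch below. The class names, durations, and header-building function are all made up for illustration; ImageGen Professional's real configuration format will differ.

```python
# Hypothetical sketch of per-class client-side cache times, mapped to the
# Cache-Control header a server would send. Names and durations are
# illustrative assumptions, not ImageGen Professional configuration.
CACHE_SECONDS = {
    "logo": 365 * 24 * 3600,   # effectively never changes: cache a year
    "banner": 7 * 24 * 3600,   # updated occasionally: cache a week
    "news": 0,                 # changes often: always revalidate
}

def cache_control_header(image_class: str) -> str:
    """Build a Cache-Control value for the given image 'class'."""
    seconds = CACHE_SECONDS.get(image_class, 3600)  # assumed default: 1 hour
    if seconds == 0:
        return "no-cache"
    return f"public, max-age={seconds}"
```

With a long `max-age`, the browser never re-contacts the server at all, which is the "avoid making the request" win described above, at the cost of not noticing a changed source image until the cache period expires.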
Hope this helps.