CGImage for writing large multi-page tiffs?

Jonathan Taylor

Hi all,

At the moment I am saving each frame of realtime video to its own tiff file, in a one-liner using [[myNSImage TIFFRepresentation] writeToFile:atomically:]. I am considering a change to write one rather large multi-page tiff file instead of many individual files, and am trying to think through the memory implications.

I have two possible concerns about performance if I switch to writing multi-page tiffs. The first is that that one-liner is going to be doing a lot of work (assembling the tiff data and writing it to disk) in one chunk, instead of spreading it out more evenly over time. The other is that I am going to be accumulating a lot of image data in memory before writing it. That ~might~ be undesirable from a cache point of view, but more importantly I will need to make sure I don’t risk filling up the 32-bit address space. (Yes, unfortunately, for driver compatibility reasons I am stuck with a 32-bit build here.)
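To put some rough numbers on the address-space concern, here is a back-of-envelope calculation; the frame size, pixel depth, and frame rate below are purely hypothetical placeholders, and the ~2 GiB figure is a typical usable allocation limit for a 32-bit process rather than a guarantee:

```python
# Back-of-envelope: how quickly do buffered video frames exhaust a
# 32-bit process's usable address space? All numbers are assumptions.
width, height, bytes_per_pixel = 1024, 768, 2   # hypothetical frame format
fps = 30                                        # hypothetical frame rate

frame_bytes = width * height * bytes_per_pixel  # bytes per buffered frame
usable = 2 * 1024**3                            # ~2 GiB typically usable in a 32-bit process

seconds_until_full = usable / (frame_bytes * fps)
print(frame_bytes, round(seconds_until_full))
```

At those (made-up) settings each frame is about 1.5 MiB, and buffering everything in memory would exhaust ~2 GiB in well under a minute, so some form of incremental flushing looks necessary for recordings of any real length.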

I had a quick look around for alternatives to NSImage. It looks as if the CGImage API is more set up for this: images are added one by one to a CGImageDestination, before eventually calling CGImageDestinationFinalize. It’s very possible, though, that behind the scenes the CGImageDestination is just doing the caching in memory itself, and still doing one big write to disk in Finalize. I couldn’t find any documentation guaranteeing one behaviour or the other, nor could I see any way of encouraging it to flush to disk incrementally. Can anyone offer any advice on that?
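For reference, the CGImageDestination pattern I have in mind is sketched below (error handling omitted; frameCount and the nextFrame() source of CGImageRefs are placeholders for my capture code):

```objc
#import <ImageIO/ImageIO.h>
#import <CoreServices/CoreServices.h>  // for kUTTypeTIFF

CFURLRef url = CFURLCreateWithFileSystemPath(NULL, CFSTR("/tmp/movie.tif"),
                                             kCFURLPOSIXPathStyle, false);
// The image count passed here is a hint; frames are then added one at a time.
CGImageDestinationRef dest =
    CGImageDestinationCreateWithURL(url, kUTTypeTIFF, frameCount, NULL);

for (size_t i = 0; i < frameCount; i++) {
    CGImageRef frame = nextFrame();  // hypothetical: one video frame as a CGImage
    CGImageDestinationAddImage(dest, frame, NULL);
    CGImageRelease(frame);
}

// My open question: does anything hit the disk before this call?
bool ok = CGImageDestinationFinalize(dest);
CFRelease(dest);
CFRelease(url);
```

Nothing in the API surface suggests a partial-flush call between AddImage and Finalize, which is exactly what prompts my question above.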

The other option would be to look for a third-party library. It looks like libtiff would be an option (it does support incremental flushing), but it is a lower-level API than I would ideally like. Does anyone have any other recommendations?
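To illustrate what I mean by libtiff being lower-level but supporting incremental flushing: each page is written scanline by scanline, and TIFFWriteDirectory commits that page to disk before the next one starts. A rough sketch, with the frame dimensions, pixel format, and pixelsForFrame() helper all assumed for illustration:

```c
#include <tiffio.h>

TIFF *tif = TIFFOpen("movie.tif", "w");
for (int frame = 0; frame < frameCount; frame++) {
    /* Every page carries its own tags; these values are placeholders. */
    TIFFSetField(tif, TIFFTAG_IMAGEWIDTH, width);
    TIFFSetField(tif, TIFFTAG_IMAGELENGTH, height);
    TIFFSetField(tif, TIFFTAG_SAMPLESPERPIXEL, 1);
    TIFFSetField(tif, TIFFTAG_BITSPERSAMPLE, 8);
    TIFFSetField(tif, TIFFTAG_PHOTOMETRIC, PHOTOMETRIC_MINISBLACK);
    TIFFSetField(tif, TIFFTAG_PLANARCONFIG, PLANARCONFIG_CONTIG);

    unsigned char *pixels = pixelsForFrame(frame); /* hypothetical source */
    for (int row = 0; row < height; row++)
        TIFFWriteScanline(tif, pixels + row * width, row, 0);

    /* Commits this page to disk, so frames need not pile up in memory. */
    TIFFWriteDirectory(tif);
}
TIFFClose(tif);
```

That per-page flush is attractive for my 32-bit constraint, but compared with the one-liner above it means managing raw pixel buffers and tiff tags myself, hence my hope for something in between.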

Thanks for any suggestions.