High Dynamic Range images under Linux


Author: Nathan Willis

Not all image files are created equal. Most of us know this from working with the everyday formats like PNG, JPEG, and TIFF, each of which has its own pros and cons. But cutting-edge applications from cinematography to computer vision demand more range, color depth, and accuracy than these formats can deliver. That demand drove the development of what are called High Dynamic Range file formats. Luckily for us, Linux is a first-class citizen in the HDR image world.

You may have seen the HDR acronym in reference to computer gaming as well. Video card manufacturers use it to refer to rendering scenes with very large contrast ranges. These rendering techniques are not related to HDR imaging and HDR image file formats directly — although, as we will see, game designers make use of HDR image formats to maximize visual quality.

… but some are less equal than others

The trouble with older image formats is that they were designed with output devices in mind. Older formats stored a limited number of bits per pixel and small color palettes because that was all the display devices of the day could handle. Before digital still and video cameras, few people cared about the color balance of their monitor.

Today, most of our entertainment media are being created and processed on computers. This includes scanned motion picture film, digital video, and computer-generated images in two and three dimensions that must blend in seamlessly with real life.

While we could modify existing image formats by tacking on extra bits per pixel, researchers and industry people opted for a better idea: try to define an image format that can describe the entire scene in at least as much detail as the human eye can resolve, even if it can’t all be displayed on today’s equipment. This concept is known as scene-referred imaging.

To display a scene-referred image on lowly CRT monitors or DVDs, of course, you have to create a mapping function to compress it for each display device, but the technique is archival and you can re-map the original to each destination without a generation’s loss in quality. Even better, working with scene-referred data follows the principle of keeping information around as long as possible, which gives you more flexibility, no clipping, fewer accumulated errors, and smoother undos.
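
To make the re-mapping idea concrete, here is a minimal Python sketch of one such mapping function. It uses the simple global operator L/(1+L) followed by display gamma; the function name and the choice of operator are illustrative only, not any particular application's pipeline.

    import numpy as np

    def tone_map(scene_luminance, gamma=2.2):
        """Compress scene-referred luminance into an 8-bit display range.

        A minimal sketch: the simple global operator L/(1+L) followed by
        display gamma. Real pipelines use more elaborate, per-device mappings;
        this only illustrates re-mapping the same original for a destination.
        """
        compressed = scene_luminance / (1.0 + scene_luminance)  # unbounded range -> [0, 1)
        encoded = compressed ** (1.0 / gamma)                    # account for display gamma
        return np.clip(encoded * 255.0, 0, 255).astype(np.uint8)

    # A scene spanning four orders of magnitude still lands inside 8 bits:
    scene = np.array([0.01, 0.1, 1.0, 10.0, 100.0])
    print(tone_map(scene))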

HDR image formats all have a few points in common. First, they encode color information in reference to a device-independent color model such as CIE XYZ or CIE Lab, so they can reference every color the human eye can see (and possibly more), unlike the limited gamut of color models such as sRGB.
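
As a small illustration of what "device-independent" means, the sketch below converts a linear sRGB triple into CIE XYZ using the standard matrix for a D65 white point; the helper name is ours, and a real HDR pipeline would handle gamma and gamut questions far more carefully.

    import numpy as np

    # Standard matrix for converting *linear* sRGB values to CIE XYZ (D65 white point).
    SRGB_TO_XYZ = np.array([
        [0.4124564, 0.3575761, 0.1804375],
        [0.2126729, 0.7151522, 0.0721750],
        [0.0193339, 0.1191920, 0.9503041],
    ])

    def linear_srgb_to_xyz(rgb):
        """Map a linear sRGB triple into the device-independent CIE XYZ space."""
        return SRGB_TO_XYZ @ np.asarray(rgb)

    print(linear_srgb_to_xyz([1.0, 1.0, 1.0]))  # roughly the D65 white point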

Second, they try to preserve the large contrast ratios we experience in the real world. The eye can resolve about four orders of magnitude of contrast at any one moment, but because it adjusts to a wide range of lighting conditions, those four orders of magnitude may fall anywhere within a much larger range (arguably 15 or so, but who’s counting — it’s not easy to test).

Last but not least, HDR image formats try to be of uniform quality across the entire brightness range; a format that does a smashing job on the highlights but is murky in the shadows is a waste of effort. The eye does not respond to light linearly, and that means finding an encoding scheme that quantizes brightness usefully but with predictable numerical error. Usually HDR formats pick a logarithmic encoding, but the schemes can be far more complicated. HDR wizard Greg Ward has written an excellent (albeit highly technical) exploration of the topic.
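
As a rough sketch of why a logarithmic encoding has this property, the following Python snippet quantizes luminance to integer codes spaced evenly in log space. The ten-decade range and 16-bit code size are arbitrary illustrative choices, not any real format's parameters; the point is that the relative error comes out nearly constant across the whole range.

    import numpy as np

    # Illustrative parameters: squeeze ten orders of magnitude into 16-bit codes.
    L_MIN, L_MAX, CODES = 1e-4, 1e6, 2 ** 16

    def log_encode(luminance):
        """Map luminance to an integer code proportional to its logarithm."""
        t = (np.log10(luminance) - np.log10(L_MIN)) / (np.log10(L_MAX) - np.log10(L_MIN))
        return np.round(t * (CODES - 1)).astype(np.uint16)

    def log_decode(code):
        t = code / (CODES - 1)
        return 10 ** (np.log10(L_MIN) + t * (np.log10(L_MAX) - np.log10(L_MIN)))

    # The relative quantization error is roughly the same at every brightness level:
    for L in (1e-3, 1.0, 1e5):
        err = abs(log_decode(log_encode(L)) - L) / L
        print(f"L = {L:>8}: relative error = {err:.2e}")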

Meeting these conditions is not a trivial task. In addition, you probably also want a format that’s space-efficient, at least a little flexible on things like compression, and that avoids those downright weird situations that often arise in mathematical color models, such as negative colors and hues brighter than pure white, which are hard to explain to the camera-shopping public.

The formats

One of the earliest HDR formats to see widespread use was Kodak’s Cineon. Cineon is a 10-bits-per-channel logarithmic encoding designed to work with motion picture film. It is supported natively in a number of high-end film scanners and recorders (and by “high-end” we are talking five- and six-digit prices), including a line of devices built by Kodak for digital film restoration.

These days, Cineon has been superseded by the Digital Moving Picture Exchange (DPX) format, which is an SMPTE standard (SMPTE 268M-2003) that extends Cineon in several useful ways. First, while Cineon can store data only in 10-bit log form and Kodak’s reference color space, DPX can use other encodings and bit depths as well. Second, DPX adds several header fields (such as timecode and sampling rate) useful for video processing. Finally, DPX aligns header and data at fixed offsets, allowing file updates without having to read in and write out the entire file.
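
The fixed layout makes the header easy to poke at by hand. The sketch below (our own helper, not part of any toolkit) reads the four-byte magic number that begins every DPX file ("SDPX", or "XPDS" when written with the opposite byte order), along with the 32-bit offset to the image data that immediately follows it.

    import struct

    def dpx_info(path):
        """Read a DPX file's magic number and image-data offset from the fixed header."""
        with open(path, "rb") as f:
            magic = f.read(4)
            if magic == b"SDPX":
                order = ">"   # big-endian file
            elif magic == b"XPDS":
                order = "<"   # little-endian file
            else:
                raise ValueError("not a DPX file")
            # The 32-bit offset to the image data sits right after the magic number.
            data_offset = struct.unpack(order + "I", f.read(4))[0]
        return order, data_offset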

Though DPX has now officially replaced Cineon, the expensive Cineon-only digitization hardware is still widespread, so both variants are still supported in most HDR software. Kodak provides a test image “target” in Cineon format, if you are curious.

OpenEXR was developed by Industrial Light and Magic and released under a modified BSD license in 2003. It supports 16-bit floating point, 32-bit floating point, and 32-bit integer pixels. It covers more than the entire visible color spectrum, and more than 10 orders of magnitude in brightness. The file format allows for pluggable compression schemes (of which wavelet compression is the most common, but not required) and allows extra, non-color channels like alpha, depth, or other information that may be useful in computer-generated art.
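
If you want a feel for what 16-bit floats buy you, numpy's float16 uses the same "half" layout (1 sign bit, 5 exponent bits, 10 mantissa bits), so a couple of lines of Python show the range and precision involved; this is just a poke at the number format, not the OpenEXR library itself.

    import numpy as np

    # numpy's float16 uses the same 16-bit "half" layout as OpenEXR:
    # 1 sign bit, 5 exponent bits, 10 mantissa bits.
    half = np.finfo(np.float16)
    print(np.nextafter(np.float16(0), np.float16(1)))  # ~6e-08, smallest positive value
    print(half.max)                                     # 65504, largest finite value
    print(half.eps)                                     # ~0.001, i.e. about 3 decimal digits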

ILM used the EXR format internally for years before releasing it as free software; since the release it has become popular with other special effects and animation studios. NVIDIA and ATI both support EXR data in their graphics cards; its 16-bit floating-point format is the same as the Half data type in NVIDIA’s Cg language. The OpenEXR download page has precompiled binaries for Windows and OS X, and a hefty package of sample files you can work with on any system.

Other HDR formats include RGBE, which grew out of the Radiance ray-tracer. Since ray-tracing works by calculating all the light in a scene (i.e., not just the few orders of magnitude visible on a monitor), Radiance’s developers needed a way to model more brightness than a normal RGB scheme allowed.

RGBE uses 8-bit RGB triples plus an 8-bit exponent denoting the magnitude of the pixel’s brightness, extending the contrast range to 76 orders of magnitude — far, far, far more than the human eye can see. I mean, far. The creators of the format later created LogLuv, an extension of the extension-friendly TIFF format, sanely using more of their bits for the color and fewer for the exponent.
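
For the curious, here is a small Python sketch of the shared-exponent idea, along the lines of the classic Radiance conversion: take the largest of the three components, split it into mantissa and exponent, and store the exponent once for all three channels. The function names are ours.

    import math

    def float_to_rgbe(r, g, b):
        """Pack a floating-point RGB triple into 4 bytes: three 8-bit mantissas
        plus one shared 8-bit exponent, following the classic Radiance scheme."""
        v = max(r, g, b)
        if v < 1e-32:
            return bytes(4)                     # too dark to represent: black
        mantissa, exponent = math.frexp(v)      # v == mantissa * 2**exponent, 0.5 <= mantissa < 1
        scale = mantissa * 256.0 / v
        return bytes((int(r * scale), int(g * scale), int(b * scale), exponent + 128))

    def rgbe_to_float(rgbe):
        """Undo the packing (approximately: the 8-bit mantissas lose precision)."""
        r, g, b, e = rgbe
        if e == 0:
            return (0.0, 0.0, 0.0)
        scale = math.ldexp(1.0, e - 128 - 8)    # 2**(e - 128) / 256
        return (r * scale, g * scale, b * scale)

    print(list(float_to_rgbe(1.0, 0.5, 0.25)))  # [128, 64, 32, 129]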

In 2004, Adobe released a specification it calls Digital Negative (DNG). DNG is designed to be a wrapper for various flavors of RAW camera and sensor data, not for general-purpose image files used in editing, but it is technically an HDR format.

The software

Now to the fun part. On Linux systems, the popular ImageMagick command-line graphics library supports Cineon and DPX files. The ImageMagick offshoot/fork GraphicsMagick, on the other hand, relegates Cineon to “legacy” format status, has a far more complete DPX read/write implementation, supports LogLuv TIFF, and lists OpenEXR on its “to do” list.
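
Since the formats are wired into the regular ImageMagick tools, pulling a film frame into a more everyday format can be a one-liner; a hypothetical example (the filenames are made up) driven from Python:

    import subprocess

    # ImageMagick's convert picks the input and output formats from the extensions.
    subprocess.run(["convert", "frame0001.dpx", "frame0001.tiff"], check=True)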

Among the GUI image editors, CinePaint handles the most: Cineon, DPX, OpenEXR, and LogLuv TIFF. The GIMP can handle Cineon and DPX through plugins (and the Cineon plugin is more recent than the DPX plugin). Krita supports OpenEXR natively, and inherits the ability to read Cineon and DPX from its ImageMagick dependency.

Raytracers, 3D modelers, and animation packages in the proprietary software realm can almost always output to OpenEXR, and this is true of Linux offerings like Maya and Shake. Several open source rendering and 3D apps — including Yafray and Blender — either support OpenEXR now or are working on it.

There are a few HDR-specific applications available for Linux as well. exrtools supports (surprise!) OpenEXR images, while pfstools supports OpenEXR and RGBE.

HDRIE and HDRShop are both multi-format HDR image editors, though they don’t seem to be under active development.

The personal site of HDRShop creator Paul Debevec lists some other software titles. And Greg Ward, the man behind Radiance, RGBE, and LogLuv TIFF, has some free utilities on his personal site as well.

Of course, once you have found your software of choice, finding and creating HDR images becomes an issue. Rendering them from a computer-generated scene is straightforward enough, and good digital cameras today capture enough data to make HDR image formats worth using. For film, drum scanning at a professional photo lab squeezes just about every bit of detail out of the frame, although it will cost you. Alternatively, there are techniques for combining multiple bracketed exposures into a single HDR image.
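
To give a flavor of how bracket merging works, here is a simplified Python sketch: divide each exposure's pixel values by its exposure time to estimate scene radiance, then average the estimates with weights that favor well-exposed pixels. It assumes a linear camera response, which real cameras do not have (production tools recover the response curve as well), and all the names here are ours.

    import numpy as np

    def merge_exposures(images, exposure_times):
        """Merge bracketed exposures into one HDR radiance map.

        A simplified sketch: assumes a linear camera response and images given
        as float arrays in [0, 1]. Each exposure's pixels are divided by its
        exposure time to estimate scene radiance, then averaged with a "hat"
        weight that trusts mid-range pixels more than clipped highlights or
        noisy shadows.
        """
        numerator = np.zeros_like(images[0], dtype=np.float64)
        denominator = np.zeros_like(images[0], dtype=np.float64)
        for img, t in zip(images, exposure_times):
            weight = 1.0 - np.abs(2.0 * img - 1.0)  # peaks at mid-gray, zero at 0 and 1
            numerator += weight * (img / t)
            denominator += weight
        return numerator / np.maximum(denominator, 1e-8)

    # Three hypothetical brackets of the same (single-pixel) scene:
    brackets = [np.array([0.02]), np.array([0.2]), np.array([0.9])]
    times = [0.01, 0.1, 0.5]
    print(merge_exposures(brackets, times))  # about 1.9 in scene-radiance units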

But remember, even if you aren’t using 32-bits-per-channel originals, storing your data in an HDR image format saves you from roundoff, clipping, and all kinds of computational error that accumulates during the editing process. And what’s a few extra bits between friends, anyway?