For those who shoot video regularly, high-contrast lighting situations present a challenge: you have to trade off between correctly exposing for the shadows or the highlights in the frame. Thanks to HDR video techniques, you can have the best of both worlds.

In an earlier post here back in September, fellow author Christophor Rick talked about HDR for video. This article builds on that, albeit from a videographer's perspective.

HDR, or High Dynamic Range, is already common ground in stills photography. Thanks to the bracketing function on almost any DSLR (read my DSLR how-to guide here), you can instruct your camera to take sequential exposures several stops below and above what is generally regarded as correct exposure. Afterwards, you can combine these images in Photoshop into a new HDR image that shows correctly exposed detail in both the shadows and the highlights.

[Example images: an underexposed frame, an overexposed frame, and the combined HDR result]

On the left, you can see that the image was underexposed. The image in the middle does the opposite: blown-out skies, but better exposure for the water. By combining these two images in Photoshop (using a process of tonal mapping), you can create an HDR image (on the right) which exposes correctly for both sides of the high-contrast situation.
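To make the tonal-mapping idea a bit more concrete, here is a minimal sketch in Python of one way two aligned exposures can be fused: each pixel is weighted toward whichever frame exposes it best. This is a toy illustration with made-up values, not how Photoshop's HDR merge actually works internally.

```python
import numpy as np

def fuse_exposures(under, over):
    """Blend an underexposed and an overexposed frame of the same scene.

    Each pixel is weighted by how close it sits to mid-grey (0.5), so
    well-exposed regions of either frame dominate the result. Inputs
    are float arrays scaled to [0, 1] and assumed pixel-aligned.
    """
    w_under = 1.0 - np.abs(under - 0.5) * 2.0   # high weight near mid-tones
    w_over = 1.0 - np.abs(over - 0.5) * 2.0
    total = w_under + w_over + 1e-6             # avoid division by zero
    return (under * w_under + over * w_over) / total

# Toy frames: the dark take holds highlight detail (the sky), the
# bright take holds shadow detail (the water).
under = np.array([[0.05, 0.45]])   # sky OK, shadows crushed
over = np.array([[0.55, 0.95]])    # shadows OK, sky blown out
fused = fuse_exposures(under, over)
```

Real tonal-mapping tools add smoothing, colour handling, and local contrast on top of this, but the core principle of per-pixel, exposure-based weighting is the same.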

Unlike our eyes, cameras have a very limited contrast ratio, typically anywhere between 30:1 and 50:1. Contrast that with the human eye, which is capable of handling ratios that vary from 400:1 to 10,000:1, and you'll quickly understand why cameras cannot record what we see in real life. With HDR and 32-bit float technology, we are capable of upping the game.
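Those ratios are easier to compare in photographic stops, where each stop is a doubling of light, so stops = log2(ratio). A quick back-of-envelope conversion (the helper name is mine):

```python
import math

def ratio_to_stops(contrast_ratio):
    """Convert a contrast ratio (e.g. 50 for 50:1) to stops of latitude."""
    return math.log2(contrast_ratio)

camera_stops = ratio_to_stops(50)      # ~5.6 stops for a typical camera
eye_stops = ratio_to_stops(10_000)     # ~13.3 stops at the eye's upper bound
```

By this measure, even a 13-stop RAW camera only just reaches the upper range of what our eyes routinely handle.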

From Stills to Motion: HDR Timelapsing

One can easily imagine the above procedure working for several of these HDR photos in a sequence, more commonly known as a timelapse. By instructing your DSLR to take a bracketed exposure at a set interval (say, every 10 seconds), you can create something like the video below (please select 720p playback):

Although the effect is beautiful, timelapses are generally used for special-effect purposes in a typical video production. For example, take those fast-flowing skies over the New York skyline in almost any episode of The Apprentice, in the video above, or these amazing TimeScapes from Tom Lowe. Because timelapses compress long periods of time, any people in the frame appear to move at high speed. This can be a desired effect; I frequently use it in my own productions for build-up or transitioning shots. However, timelapses (even HDR ones) cannot be used as a technique with live talent in realtime. Contrasty lighting situations do occur from time to time, though, so we need to find a solution to this problem.
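As an aside, the interval arithmetic behind a timelapse shoot is simple enough to sketch; the helper and numbers below are illustrative only:

```python
def timelapse_playback(shoot_hours, interval_s, playback_fps=25):
    """Return (frames shot, playback seconds) for an interval shoot.

    Shooting one (possibly bracketed) exposure every `interval_s`
    seconds for `shoot_hours` hours yields frames that are then
    played back at `playback_fps` frames per second.
    """
    frames = int(shoot_hours * 3600 / interval_s)
    return frames, frames / playback_fps

# One exposure every 10 s for 2 hours: 720 frames, just under 29 s at 25 fps.
frames, seconds = timelapse_playback(2, 10)
```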

Caveat: True versus faux HDR

Before we move on to the possible solutions, one important note must be made. Most video cameras are not capable of recording RAW footage (yet). Currently, only a few cameras support RAW recording, including the RED One/Epic and ARRI Alexa. Both companies are said to be racing to patent an in-camera solution for recording well beyond 12 stops of latitude in a single take. RED states that with its new HDRx technology, latitude can go up to 18 stops, while ARRI mentions a mere 13 stops. Look at this clip released by RED: although it appears dull at first, that very flatness contains tons of data that can be manipulated in post production. Data that would normally be lost due to too much contrast/sharpness in the picture and/or high compression.

It is widely expected, however, that Canon will at some point in the future unleash a clean HDMI output for their current or at least their next breed of DSLRs (hopefully native RAW recording) — this is, after all, the number one request from users. When connected to an external recorder such as the Ki Pro, nanoFlash or Ninja, even the much anticipated Panasonic AF100/101/102 (depending on where you are on this planet) can record at much higher datarates and with less compression, thus preserving more of the fine detail.

Therefore, until RAW recording is more widely available, the solutions below may be described as faux HDR. Compared to the world of stills photography, faux HDR is the equivalent of creating an HDR image from two JPEGs instead of from RAW files (true HDR). RAW files, of course, record far more of the light captured by the sensor than what remains after translation into a compressed, 8-bit JPEG file format, as Andrew Kramer shows in this tutorial.

Regardless of whether it is true HDR or faux HDR we are talking about, the general idea is to significantly increase the latitude of the recorded images by correctly exposing for both the highlights and the shadows. Instead of realising that in one take (as with the RED or the ARRI), we have to resort to increasing the overall latitude by mixing multiple recorded takes: essentially, combining under- and overexposed areas, much like we do when creating HDR stills.

There are two methods to this end: either sequentially or simultaneously.

Sequential Recording: Motion Camera Control

Basically, this technique involves shooting a backplate take and a live-action take sequentially.

For example, an actor who is positioned directly opposite the sun or against a window. Common camera sense would advise against such high-contrast situations (e.g. never shoot against an opposing light source, a lesson I learned before changing my career towards camerawork), but sometimes there is no other choice, or it is a deliberate creative choice. Under normal circumstances, the above-mentioned trade-off comes into play. However, with motion camera control you can sequentially expose correctly for both the lighter and the darker areas of the frame. Motion control can be achieved either by locking off the camera or by repeating the same move.

Typically, you would lock off the camera on a tripod: make sure you lock it off completely, which probably means using a remote start-stop device to control the record button. Then, first expose for the background (the window) and record for the same length of time as the live action will take. Next, redo the scene, but now with the actor in place, exposing correctly for his face. As a result, the window will most likely blow out into a white blob. In post production, you can then combine this 'backplate' take and the sequential 'action' take into a single HDR video. As far as I know, there is no software or plugin available to do this easily, but through a combination of rotoscoping and/or tonal mapping, I would imagine this to work (read on until you see the video at the bottom of this article).
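Lacking an off-the-shelf plugin, the backplate/action combine could be approximated with a simple luminance matte. The sketch below is a hypothetical illustration (my own function and toy values), assuming a perfectly locked-off camera so the two takes are pixel-aligned; real work would involve rotoscoping around the talent.

```python
import numpy as np

def composite_takes(action, backplate, clip_level=0.95, softness=0.1):
    """Composite one frame of a sequential HDR pair.

    Wherever the action take is blown out (above `clip_level`), fade in
    the correctly exposed backplate; the soft ramp avoids a hard matte
    edge. Frames are float arrays in [0, 1], pixel-aligned.
    """
    matte = np.clip((action - (clip_level - softness)) / softness, 0.0, 1.0)
    return action * (1.0 - matte) + backplate * matte

# Toy frame: the actor's face is exposed correctly in the action take,
# but the window behind him is clipped; the backplate holds window detail.
action = np.array([[0.4, 1.0]])      # face OK, window blown out
backplate = np.array([[0.1, 0.7]])   # window OK, face too dark
frame = composite_takes(action, backplate)
```

The soft ramp (`softness`) is the crude stand-in for what tonal mapping does far more gracefully across the whole image.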


However, locking off the camera typically results in quite static shots. To make your shots more dynamic, you could use a pan/tilt/dolly move. But doing things sequentially requires identical, repeatable camera motion. Although there are very professional solutions available, nowadays there are also more affordable ones, such as the motors and controllers made by Kessler Cranes. Although these were originally designed for their PocketDolly and CineSlider products, a more recent addition, the ShuttlePod, allows for travel over longer distances.

The beauty is that Kessler Cranes also supplies an Oracle controller that allows for repeatable motion, meaning you can repeat the same camera move time and again. In the second part of the video, you can see how Tom Guilmette superimposed himself across multiple takes of the same camera move. It is not hard to see how this technique could be used for sequential HDR video recording as well.

The upside of this solution is that you can do it with a single camera unit. The downside of the sequential approach, of course, is that lighting situations change, especially when shooting outdoors. Think of the sun moving behind a cloud, or the speed at which the sun sets or rises. That's when you need to look to the other solution. Here's another video showcasing a similar motorised camera control system:

Simultaneous Recording: 3D Beam-Splitting Technology

Currently, no cameras are available with built-in beam-splitting technology, although I have been told that ARRI holds a patent on some technology to that extent. Basically, recording HDR simultaneously means using two identical cameras/lenses.

Although I am not a believer in the recent surge of 3DTV technology (I consider it more push than pull, all supply and no demand), I can totally imagine 3D technology taking off for HDR video purposes. The acquisition method of beam-splitting can not only be used for creating images suited for 3D, but can also help solve a real-world problem in 2D: that of high-contrast lighting situations.

In 3D there are two methods of recording: either by placing the cameras side by side, or by using a mirrored solution with two cameras placed at a 90-degree angle. Both videos below from 3D Film Factory explain the differences between these two techniques:

Placing the cameras side by side provides a desired effect for stereoscopic 3D movie recording. Much akin to how our eyes work, the side-by-side placement creates the illusion of depth. However, for HDR video purposes, this set-up tends to create more problems than it solves, as you get a parallax effect which cannot be easily fixed in post production. Parallax is an effect you might want to strive for, as this camera set-up of 52 DSLRs so nicely shows, but when it comes to HDR, prevention is better than cure. So the beam-splitting solution is the way forward for true HDR video recording, as evidenced by the videos below, from E3D Creative and Soviet Montage:

The upside to beam-splitting is that you record an identical image at the same time, meaning that there will be no differences in lighting situations or time. In order to create an HDR video, all that needs to be done is expose the cameras differently. In our example, one for the background, the other for the actor. Subsequently, you combine them in post production (as explained before).

The downside clearly is that you need both a beam-splitting rig and two sets of identical cameras/lenses in order for it to work. This instantly triples the gear needed for a production, but thanks to the DSLR revolution, gear is becoming more and more affordable. Nonetheless, for now this approach easily involves $10k worth of equipment.

Conclusion: 3D's real future is HDR Video!

HDR video definitely has a future, as it solves a real-world everyday problem for videographers and cinematographers alike.

Unlike 3D, I should add. In September I visited the International Broadcast Convention (IBC) in Amsterdam (more coverage on that later), and everyone and their neighbor had solutions for a market that is nascent at best. Don't get me wrong, I believe 3D is fantastic for feature-length movies in theaters; we have all seen Avatar. Epic. However, I just don't see my family and me huddling around the TV set, each wearing stereoscopic glasses. Not now, nor at any point in the future. I know that some manufacturers are developing solutions that don't require you to wear 3D glasses, but these are far from perfect. Above and beyond all this, I just bought a new FullHD TV set which I expect to last for at least 5-8 years, and I don't think I am alone. Look up the statistics and you can easily predict that 3DTV has a long and winding road ahead. All supply, virtually no demand.

Nonetheless, as explained in this article, the technology behind 3D can be put to excellent use for creating moving images with an HDR look and feel. Agree or disagree? Please feel free to comment below.

Want to see more HDR videos? Feel free to watch/join/add to this dedicated HDR Video group on Vimeo.

  • mckinneyite21

    I think the part I need the most help with is the manipulation during post production to make it not so dull. I appreciate you sharing so much of your 3d production advice with us. I'm looking forward to trying some of it out this week. If I do get the parallax effect, what should I do?

  • Alejandro Ruizdelacuesta

    I do not know, Richard, there is a natural want (pull) for realism, and 3D provides the closest there is for now. The ability to see depth is unmatched by the HDR techniques, as beautiful as the images it creates are. I have been experimenting with the Fujifilm 3D camera, and the results are truly astounding when seen in large 3D TV. The ability of making objects "come out of the screen" and almost reaching you really downplays the ridicule of having to watch each other on the funny 3D glasses. Mind you, that the new 3D glasses are very light and come in a variety of colors. Add fashion to the 3D world...

  • WJM

    PS: I focused on HDR-art video instead of regular HDR video (and referred to HDR-art stills on previous models), because only that is mentioned with the latest ZR15 & ZR200, no HDR video (although it obviously could/should be present (the manual does refer to different settings/strength for HDR-art, but assuming this variation is identical to Tryx & ZR100, then it only offers settings 1-2-3, not 0 for ordinary/non-art HDR (HDR-art & HDR are 2 completely different menu-topics)).

    Btw, my guess about the why: Casio has been (one of?) the leader(s) in high-speed video....and only that kind of frame rate & processor capacity/speed would allow HDR video, requiring at least 2-3x more frames for the same shot.
    (though the manual doesn't mention any resolution restrictions, like is common/default with high-speed video....but I wouldn't hold my breath on seeing 1080p & 50fps HDR video soon....;))
    (oh well, maybe they skip that step, and go to 4k HDR-video right away....:))

  • WJM

    Casio ZR15 (released October) and ZR200 (released 18 November) both offer HDR-art video.
    Previous models (Tryx, ZR100) already did HDR-art stills.
    (and unless you have seen the lovely effects from night-light scenery, I would not dismiss it....it's the same wonderful surprising stuff we liked so much with analog film, never really knowing for sure what the end-result would be....:))

    Of course, what is lacking now, after I fell in love with the 21mm panorama's of the Tryx, is HDR-panorama's (should require only a software-update!).
    (if not, someone ought to hack the darn thing!....:))

  • Richard van den Boogaard

    @Sean: that's an HDR timelapse, not an HDR video. Please read the article to learn the difference...

  • Sean

    What about that one:

  • Alejandro León Márquez

    Pretty interesting

    • Alejandro León Márquez

      @[635228716:2048:Orlando A. Adriani V.]

    • Alejandro León Márquez

      Man, that's the article I told you about on your profile. Check it out, you'll trip out!!!!

  • Patricio Rodriguez

    Great article!

    Thanks for the valuable intel on this one... I've been looking everywhere for these tips.
    I'm gonna try some HDR video with two 60Ds... hope it works fine.

    Keep it up!

    • Richard vd Boogaard

      You're welcome. Please share your trials on the Vimeo HDR Video group!

  • Leandro Lefa

    I agree. HDR is a need, not like 3D.
    Have you read what editor Walter Murch wrote about 3D? You might find it interesting.

    • Richard vd Boogaard

      Walter is totally right. I think 3D is a gimmick; it will not last...

      Last night, I watched The American from photographer-turned-director Anton Corbijn. What a relief this movie was compared to all the fast-paced cutting we see everywhere these days. Movies are stories being told, and they should suck you right in; 3D is a technical flip-book gimmick that prevents you from getting into the story and leaves you to just stare at some FX (with a complimentary headache).

  • ano
    The first real HDR MUSIC VIDEO!

  • optikronix

    Hi Richard,
    i noticed how the dslr rig in the soviet montage video sets each camera to -/+ 2 exposure...
    with the canon 5D2, was this just a matter of changing the aperture and shutter speed on each camera? cos if doing so wouldnt that also mean focusing on different distances even if its the same lens on both cameras?
    or am i missing another camera function to set exposure altogether?

    • Richard van den Boogaard

      There are a number of ways to control your exposure. I know that you don't want to touch the shutter speed - you have to adhere to the 180 degree rule in order to keep motion fluid. If you go above 1/50th (at 24/25 fps) you will get staccato movement; below 1/50th you will get motion blurring (for 50/60 fps you need either 1/100th or 1/125th)

      Furthermore I don't think you want to change the focal plane too much by changing the aperture; if you do, you'll end up with mixes of blurred sections in the image which you then have to blend together in post.

      This probably leaves two options for controlling exposure: placing an ND filter in front of the lens, or using the ISO levels. With a VariND from SinghRay or a FaderND from Lightcraft you can vary the incoming light from 2 to 8 stops. Using ISO for this purpose is more likely to increase noise levels in the image (unless you have a 1D Mark IV).
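For reference, the 180-degree rule mentioned above boils down to exposing each frame for half its duration; a tiny sketch (the helper name is mine):

```python
def shutter_for_180(fps):
    """180-degree shutter rule: shutter speed is half the frame duration."""
    return 1.0 / (2 * fps)

shutter_24 = shutter_for_180(24)   # 1/48 s, in practice rounded to 1/50
shutter_50 = shutter_for_180(50)   # 1/100 s
```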

      • optikronix

        Hey thanks, just curious how to go about it cos I'm setting this as a creative project for a class. Thinking of buildin my own rig for this....

  • Christophor Rick

    ARGH! it's christophOr! Now I can be offended like Grant was that one time I got his surname wrong ;)

    • Richard van den Boogaard

      My bad. No need to be offended any more - I just corrected it.

      • Christophor Rick

        No worries, it happens so much, especially in my business emails...I just don't usually say anything ;)

  • lorrainegrula

    Hello Richard and Mark. Thanks for a fantastic and informative post. You do such a great job here at Reel SEO. I'd like to add that the basic technique you described here, compositing two or more versions of the same image with different qualities and properties in post production, can create all kinds of interesting effects. For example, a soft focus from a blur filter combined with the original clip in sharp focus can give a great "misty" effect. Kind of hard to describe, but it sure looks nice. Makes it "glow" a bit. You can do the same thing with colorization, like having a black-and-white background but the subject in full color. Since this is all advanced editing, if someone does not want to bother combining shots, or if their edit program doesn't let them, adding front light to a backlighting situation will also solve the problem Richard describes in the article. Then your subject will be well lit, and the background too, all without having to resort to advanced editing techniques. Cheap and easy! I love cheap and easy! Lorraine Grula

    • Richard van den Boogaard

      Hi Lorraine,

      Thanks for your comment. Indeed, you can do multiple wonderful things when compositing clips.

      True, you can use a fill light for an underexposed area in the frame, to some extent. Details at either end of the spectrum (highlights and shadows) don't get recorded very well on a regular camera unless you shoot RAW, which cannot be done cheaply at the moment (it takes a RED Epic or ARRI Alexa).

  • jim louderback

    We recently shot Diggnation in HDR. It came out really well - check it out here -

    jim louderback, ceo revision3

    • Mark Robertson

      just watched it - that is great Jim.

    • Richard van den Boogaard

      Great footage. Too bad the guys from SovietMontage are so intent on keeping their rig a big secret.

      • Richard – The Camera Dolly Guy

        Well the guys at SovietMontage really have some fantastic footage.

        • Richard van den Boogaard

          True, but why does everyone seem to think that this HDR video production thing should be kept a big secret? IMHO it solves a genuine problem, so let's all benefit from that new insight.

        • Richard van den Boogaard

          From the blurred out section in the Episode we can definitely tell that this more likely is a side-by-side rig or a single lens attached to a T-shaped beam-splitter.

  • Andi

    RED has lately announced HDR functionality in the new Epic and Scarlet cameras, using RAW technology to enable 18 f-stops of latitude without need for multiple exposure.

    • Richard van den Boogaard

      Of course, what RED does is the ultimate future. Makes you wonder how they are capable of capturing at multiple exposures simultaneously in camera. I believe they call it EasyHDR™- done in camera with "Magic Motion".

      However, RED and even more so the ARRI Alexa come with price tags that are only affordable for those in professional cinema. The EPIC will be available at $24k and the Alexa starts at $45k.

      I have updated this article by adding a paragraph on true versus faux HDR. Faux HDR is what we are currently forced to do, as so few cameras support RAW recording. I expect that in the future more cameras will enable this, or at least a clean output which can be recorded on an external device, as the Panasonic AF100 already does.

      In order to make true HDR, we will need to resort to multiple takes. Thanks to 3D rigs, we can.