The cinema5D Camera Lab is Back – Dynamic Range Tests

October 22nd, 2018

As many of you are well aware, the cinema5D Lab was built a few years ago, and huge effort went into detailed camera tests covering dynamic range, sharpness, rolling shutter and more.

No new measurements have been made for a while, but we have now decided to start again, step by step, building on that foundation.

Therefore, as a first step, we are bringing back the dynamic range measurements. This article describes the workflow and philosophy behind our test, and should serve as a reference for readers who want to engage with or contribute to our method and findings.

Dynamic Range Measurements – the Conundrum

To the best of the author’s knowledge, there is no industry standard for movie cameras, and even the most basic definitions are vague, as you will quickly see (there is, however, an ISO 15739:2003 standard for digital still picture cameras, from which we will borrow some of our terms and definitions).

That has left a lot of room for interpretation and allows each camera manufacturer to publish numbers that are not always directly comparable between cameras.

Because the dynamic range of a camera is an important feature, our goal here is to establish an independent reference benchmark that allows direct comparison between the various cameras on the market.

What is the Dynamic Range of a Camera?

The dynamic range of a camera can be defined, simplistically, as the ratio of the maximum to the minimum luminance that the camera can capture.
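Expressed in stops – the unit used throughout this article – that ratio is simply a base-2 logarithm, where Lmax and Lmin are the maximum and minimum capturable luminance from the definition above:

DR [stops] = log2 ( Lmax / Lmin )

Each additional stop therefore corresponds to a doubling of that luminance ratio.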

Sounds good, but how do we quantify the term “capture”?

A camera’s ability to capture highlight detail is quite obvious, but the problem starts in the shadows: here, noise kicks in quickly, and we are left with room for interpretation as to which level of noise is still usable. So at this point, let’s talk briefly about sensor noise and the signal-to-noise ratio (SNR).

Noise is the random variation of a pixel’s reported luminance compared to its actual luminance value.

Therefore, below a certain base luminance the image may be so corrupted by noise that we cannot really say that detail in the shadows is captured accurately or that image detail is usable.

The ISO 15739:2003 standard is helpful here, as it defines a signal-to-noise ratio of 1 as the threshold value for dynamic range measurements. However, experience shows that for movie cameras a signal-to-noise ratio of 2 (equivalently, a root-mean-square noise value of 1/SNR = 0.5) is a more conservative threshold that leads to “usable” shadow detail.
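As a simple illustration (hypothetical numbers, and assuming the noise floor stays roughly constant in the deep shadows): if a dark patch records a mean signal of 10 code values with an RMS noise of 5 code values, its SNR is 10 / 5 = 2, exactly at our threshold. One patch (stop) further down the chart the signal halves to about 5 code values while the noise stays around 5, the SNR drops to 1, and that patch no longer counts towards our dynamic range figure – although it would still count under the ISO definition.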

Luckily for us, the ARRI ALEXA’s factory dynamic range claim of 14 stops corresponds very well to the 1/SNR = 0.5 threshold, as past cinema5D results have shown, hence we will continue to use this threshold value to determine the dynamic range of different cameras.

The Testing Procedure

We are using a DSC Labs Xyla 21 backlit transmissive chart, which provides 21 stops of dynamic range, in a completely dark room.

This chart is filmed off-center from the horizontal axis (to avoid lens flare) using the following methodology:

At the highlight (left) end we hard clip the first two patches, then stop down until the second patch is on the cusp of clipping. 

Now we have two methods to determine the dynamic range:

a) a visual inspection of a waveform plot and the recorded patch image

b) IMATEST: we export frames from the video file and run them through the IMATEST software, which calculates a numerical value for the dynamic range using advanced image analysis algorithms (see http://www.imatest.com/docs/stepchart/).

While transmissive tests are the simplest, most accurate and most comparable tests of dynamic range, they give us very little information about how the camera renders color and detail across that range. Hence, dynamic range is just one piece of information among many when making a judgement about a camera.

Visual Inspection and the IMATEST Software Results

As the first patch is clipped but the second is still within range, we start counting visually from this second patch downwards: from patch two to three is the first stop, from three to four the second stop, and so on. So far so good.

At the shadow end, visual inspection is more difficult, as we have to count down to the last discernible stop sticking out of the noise floor.

See figure 2 below from the Sony FS7 (SLOG3, 10-bit internal, values scaled to 8-bit), which I tested again among other cameras to ensure consistency with past measurements.

The noise is (visually) indicated by the red line, and if we count all clearly discernible stops from the noise floor we get about 12 stops, maybe a bit more.

Fig. 2: Sony FS7 SLOG3 step chart waveform plot (from Premiere Pro). The noise floor is clearly discernible (indicated by the red line). Every patch clearly sticking out from the noise floor is a “usable” stop, giving us a range of about 12 stops.

Now there is of course some ambiguity in the visual method (where exactly is the noise floor?), and here is where the IMATEST software comes in very handy.

Our Workflow using IMATEST:

Using the FFmpeg library, we extract the highest-quality I-frames directly from the video file (thereby avoiding any influence of an editing system), using this command:

ffmpeg -i videofile.mov -vf "select=eq(pict_type\,I)" -vsync vfr framegrab%04d.tiff
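For those unfamiliar with the command: the select filter keeps only the intra-coded (I) frames, -vsync vfr passes those frames through with their original timestamps instead of duplicating frames to maintain a constant frame rate, and framegrab%04d.tiff writes them out as sequentially numbered TIFF files.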

Those high-quality TIFF frame grabs are imported into IMATEST, and the region of interest is selected:

Fig. 3: Patch image from the video file; selection of the region of interest in IMATEST

IMATEST then analyses each patch via advanced algorithms to calculate signal-to-noise ratios and spits out the following chart (among a lot of other data) – the dynamic range for various SNR values:

Fig. 4: IMATEST Result plot for the Sony FS7

For the Sony FS7, IMATEST reports a dynamic range value of 12.1 stops for 1/SNR = 0.5, and 13 stops for an SNR value of 1. The software also identified 17.7 discernible patches.

Hence, the cinema5D dynamic range value for the Sony FS7 is 12.1 stops in UHD (3840×2160).

Image Downscaling

One word about image scaling: if you scale the Sony FS7 image from UHD down to full HD (1920×1080), IMATEST reports an increase in dynamic range of about 0.3 stops (resulting in 12.4 stops), as the downscale averages four UHD pixels into one FHD pixel, thereby lowering the noise. Another argument for 4K acquisition and 2K delivery.

What happens during downscaling, and how does it increase the dynamic range?

Well, the signal values of four neighboring pixels are highly correlated (their luminance will be roughly the same), whereas the noise on those pixels should in general show no correlation, as it is random.

Now, by downscaling (averaging) four pixels of a 3840×2160 image into one pixel of a 1920×1080 image, the signal value stays very much the same, but the noise is reduced: because noise is random (uncorrelated), its root-mean-square value – which is what we measure in the recorded image – scales with the inverse square root of the number of samples we average.

Hence, averaging four pixels into one gives a factor of two (sqrt(4) = 2) higher signal-to-noise ratio (SNR). A higher signal-to-noise ratio at a given luminance means a higher dynamic range.
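If you want to convince yourself of this statistics argument, here is a minimal numerical sketch (not part of our measurement workflow – just an illustration using synthetic, perfectly uncorrelated Gaussian noise and a plain 2×2 box average; real scalers use more sophisticated kernels and real sensor noise is not perfectly uncorrelated, which is part of why the measured gain is smaller than the idealized one):

import numpy as np

rng = np.random.default_rng(0)

# a synthetic "flat patch": constant signal plus uncorrelated Gaussian read noise
signal, noise_rms = 100.0, 5.0
uhd = signal + rng.normal(0.0, noise_rms, size=(2160, 3840))

# idealized downscale: average every 2x2 block of UHD pixels into one FHD pixel
fhd = uhd.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print("UHD noise RMS:", round(uhd.std(), 2))  # approx. 5.0
print("FHD noise RMS:", round(fhd.std(), 2))  # approx. 2.5 - noise halved, SNR doubled

In this idealized case the noise RMS halves and the SNR doubles, which would be worth roughly one extra stop before the 1/SNR = 0.5 threshold is reached; in practice, as the FS7 example above shows, the measured gain is closer to 0.3 stops.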

Scaling for our testing is also done using the ffmpeg library (libswscale) to avoid any influence of the editing software.

The nice part about IMATEST is that all analyses are based on pure mathematics rather than subjective opinion. Also, as long as one patch is clearly clipped, you can underexpose or overexpose the step chart and IMATEST will still give you the same result within a few percent. I recently ran a lot of tests with the software and retested the Sony FS7, the Sony A7S II, the Blackmagic Pocket Cinema Camera, the GH5, GH5S and the FUJIFILM X-T2 / X-T3 to make sure I could reproduce all past results – a very interesting experience, as it gives a lot of insight into the capabilities of the different sensors, and into manufacturers’ claims.

So here is a table of those recent results for your convenience (the ARRI ALEXA values are from past testing, but reproduced here as a reference too):

Fig. 5: Table of recent testing results (exception: the ALEXA values are from past cinema5D testing, shown for reference). The value in bold on the right side of the table is the cinema5D dynamic range rating.


Manufacturers will probably use the less strict ISO 15739:2003 SNR = 1 definition (signal value equals noise value), as it gives a higher dynamic range reading. As can be seen from the results above, by this definition and at the highest native resolution, the Panasonic GH5S would be a 12-stop camera, the Blackmagic Pocket Cinema Camera a 12.5-stop camera, and so on.

One more thing: at the beginning of the article, dynamic range was defined as the ratio of the maximum to the minimum luminance the camera can capture. IMATEST also reports a “patch range” value, which is the maximum number of patches it was able to discern (not shown in figure 5). Here, for example, the GH5S value was about 14.7 detected patches, and the Sony FS7 value is 17.7 (see figure 4).

Hence, whenever a dynamic range figure is stated for a camera, be aware that it only makes sense if the noise threshold used for that result is also clearly stated.

Last but not least, never forget that even the objective results of the IMATEST software are only relative values. Some cameras with advanced codecs still allow you to retrieve image detail from the noise floor with advanced post-processing, while others don’t. So don’t be discouraged by a higher or lower dynamic range value for this or that camera – in the end, it is still the person behind the camera who makes the film ;-)

To sum up, we hope you will find our results helpful, and we look forward to including dynamic range measurements in future camera reviews again! What do you think? Let us know in the comments below.

antoine amanieux
Guest
October 25th, 2019

Do you have a page where you list the DR of all the cameras you tested? Thanks

Dave
Guest
November 10th, 2018

Not a video expert, just a hobbyist. Here’s where I question your test results and methodology. Your entire test depends on software to calculate the signal to noise ratio, but that itself depends on what the camera is doing or not doing to denoise the image internally. So if camera A has superior denoising processing but camera B has no denoising, your results will not reflect the true sensor’s dynamic range. So this is why I don’t think your results are 100% useful or true. Also if one does use a denoiser in post, you can recover some dynamic range which the software rejects. So your dynamic range figure for a sensor, IMO, is flawed.

Steve
Guest
October 30th, 2018

Please do the EOS R

Adam
Guest
October 29th, 2018

Could you add the FS5 II, A7 III and A6500?

Member
October 25th, 2018

Interested in seeing how the RED Monstro’s stated 17+ stops of dynamic range turn out in your tests. Thank you for bringing this section back, Gunther! Since Canon (C-Log) with the EOS R and Nikon (N-Log) with the Z6 state on their product pages that 12 stops can be resolved, I’d be curious to see how that compares to the range of the respective cameras’ RAW still shooting, and therefore how much the logarithmic profile impacts the potential of a sensor. Blackmagic also stated in an email exchange that the 15 stops in RAW of the 4.6K URSA Mini are reduced to 12-13 stops while shooting with the Blackmagic Film log profile.

Member
October 25th, 2018

Would like to see results for the following added soon: Sony A7 III SLOG2/3, Blackmagic Pocket Cinema Camera 4K, Canon EOS R C-Log. Can you please test the new “hot” cameras? Would like to see objective results.

Robert Niessner
Guest
October 24th, 2018

Hi Gunther!
I could supply you with a few more cameras (including older models) for testing if you're interested – we would just have to arrange a date for me to come from Graz to Vienna.

BMCC, UM46K, UMP, PCC4K, Canon XF305, Canon XF405, Canon XH-A1 incl. BMD A/D converter, Sony EX3, Canon EOS R

Tim
Guest
October 23rd, 2018

Dynamic range is so important. However, what’s more noticeable is how a camera handles clipped highlights. A camera with a smooth highlight roll-off can often look so much better than a camera with greater dynamic range but a hard clip.
Any way to measure this concept?

the SUBVERSIVE
Member
October 23rd, 2018

Nice article, but since this is a page that deals with images I have to ask: why the pixelated images?

John
Guest
October 23rd, 2018

Thank you for doing these tests! In previous articles, you have put the A7sII at 12 stops of dynamic range (https://www.cined.com/fujifilm-xt2-vs-sony-a7s-ii-best-mirrorless-video-camera/) for Slog3 in UHD. Is there a reason it is just 10.6 in this chart?

John
Guest
October 23rd, 2018
Reply to  Gunther Machu

Thanks for checking! That makes sense. I’ve had discussions with people who think your tests are bogus because the A7sII is listed alternately as 12 or 10.6, with no mention of downscaling until now.

Member
October 23rd, 2018

Happy the Xyla chart is back, thanks.
Surprised the X-T3 would have lower DR than the X-T2. Really?

Lukedriftwood
Guest
October 25th, 2018
Reply to  Gunther Machu

For the test, did you turn off in-camera noise reduction? X-T2 has default “0” that cannot be changed in video mode, X-T3 can be changed to “-4”.

Johnnie Behiri
Admin
October 25th, 2018
Reply to  Lukedriftwood

Hi Lukedriftwood.
X-T3 was set to noise reduction -4.

Thank you.
Johnnie

Lukedriftwood
Guest
October 26th, 2018
Reply to  Johnnie Behiri

Thanks Johnnie, that explains the higher DR on X-T2, since it applies NR that cannot be turned off in video mode.

Dennis Schmitz
Guest
October 29th, 2018
Reply to  Johnnie Behiri

If you shoot something intended for narrative projects, you'd better turn off all the internal processing – or at least as much of it as possible.

Well done!

Robert A Cuff
Guest
October 23rd, 2018

So, in theory, to get the best DR it’s best to capture footage in 4K and downscale to 1080 in post, with most if not all of these cameras?

Tom Roper
Guest
October 23rd, 2018
Reply to  Gunther Machu

Gunther, V-Log L on the GH5S is limited to 12 stops; it does not write beyond 10-bit code value 768 (or close to it). Will you consider a retest using HLG (hybrid log gamma)? N/R = -5?

Tom Roper
Guest
October 25th, 2018
Reply to  Gunther Machu

Hi Gunther, thank you for the reply, but more importantly for rating so many cameras objectively. Well done!

Narek Avetisyan
Member
October 23rd, 2018

Fantastic article, thanks! Gives great insight into how DR is tested.

I would also like to see a Pocket 4K test next for DR and rolling shutter, as from what I’ve seen the Pocket 4K performs incredibly well in RS – even better than the X-T3. I think it will top the charts in that department.

Thanks for your hard work.
Cheers

Alex
Guest
October 23rd, 2018
Reply to  Gunther Machu

Very soon is this week? :D

Dennis Schmitz
Guest
October 29th, 2018
Reply to  Alex

Maybe today?

Dustin Jenkins
Guest
October 23rd, 2018

This is great! I wish more people had a detailed and honest view of dynamic range and acknowledged that tests like this are great but aren’t everything. Looking forward to seeing many more.
Also, those are some really surprising results for some of those cameras. Was the FUJIFILM X-T3 tested at ISO 640 or 800? Just out of curiosity, there seems to be a DR/noise trade-off.

Dustin Jenkins
Guest
October 27th, 2018
Reply to  Gunther Machu

I’ve been playing with the X-T3 footage and it seems you have to conform it to broadcast legal 16-235 range, otherwise it clips whites and blacks in Premiere. How did you go about conforming this? I can’t find a simple way.

Lukas
Guest
October 22nd, 2018

Great to see this all in one place! In my opinion this should be a pinned tab on your main page, because such a great database is only provided by your site!
I strongly recommend checking BMPCC ProRes vs. RAW using highlight recovery, because there are always details hidden in those recovered highlights. (I have also used the BMPCC for the last 5 years, and ProRes has less DR compared to RAW – I know it clips to pink, but as long as you “overexpose” within the proper range there are still details there, believe me :D). Also, I have noticed that even on the GH5 I can “recover” about 0.5 stops, maybe a little less, say 0.3, from the highlights using the HL slider in the “A” tab in DaVinci Resolve – check it! I’m so happy that this is finally here :D Next stop, the BMPCC 4K :D Cheers! Thanks for that work!

Max Mustermann
Guest
October 23rd, 2018
Reply to  Gunther Machu

Pink? I am using the BMMCC and always use the highlight recovery tool (I have also tested it a lot) and never had any problem with it. With the latest firmware, there should be no problem. I am really sceptical about those high numbers for the A7S or GH5 in V-Log mode. It is just “super flat”, but I am not sure if it is really “usable” and how the highlight / shadow roll-off behaves. I work with DSLRs and film cameras and it always seems that I can get around 10.5 stops out of a DSLR, like a Canon, and with the new GH5 or A7S there is not really any improvement. The next step up is a BM Pocket or BMMCC or BMCC. I’ve seen some comparison shots (real-life situations)… dark cafe, bright sunlight… and between the GH5 and the old Pocket, the Pocket got so much more DR.

I know test charts are a good thing. But sometimes you cannot measure “everything” with a test chart. It would be good to have a “real life” setup too: a bright source, dark areas, shadows, but also glass objects with reflections, details in bright areas, etc.

I would love to see a test chart AND a real-life setup side by side.
Also looking forward to the Pocket 4K. Anyway, it is great to see you guys working on a “DR system”. Even if it is not 100% perfect (and never will be), it is good to try to measure it, because DR is important. More important than pixel count.

tuxzilla
Guest
October 22nd, 2018

Thank you for offering this objective testing. What I would love to see is not just the total stops of dynamic range, but how those stops are distributed. It doesn’t help me to see 10 stops into the shadows if my highlights are always clipping :-)

アミル サファリ
Guest
October 22nd, 2018

In the face of all those imbecile fanboys who believe everything the manufacturer claims.

Rob
Guest
October 22nd, 2018

Awesome! I would love to see test results from more cameras. My top picks would be the 5D3 with Magic Lantern RAW, the C200, the BMD P4K and the EVA-1. Thanks!

Alex
Guest
October 23rd, 2018
Reply to  Gunther Machu

Can’t wait for the BMPCC4K results. My guess is around 12.8.

Dennis Schmitz
Guest
October 29th, 2018
Reply to  Alex

I expect it to be pretty good as well.

Max Mustermann
Guest
October 22nd, 2018

Great news to see an upgrade to the dynamic range test. But I don’t understand the part about “Image Downscaling”. Steve Yedlin (DoP of Looper, Star Wars, etc.) did a resolution comparison test. You can watch it here:

PART 2 is the “downscaling” part
http://www.yedlin.net/ResDemo/index.html

He downscales 4K (also 6K, 8K) down to 2K, and then back up again to 4K to see if there is any difference in the image. Without pixel peeping there is no difference. But in this article Gunther Machu wrote that you can get more DR by downscaling an image. How is this possible? Because the image gets softer… so less noise. And less noise = more DR. I don’t believe it.

It would be interesting to recreate the tests which Yedlin did: downscale the UHD image from a Sony FS7 to 1080p, then scale it back up to 4K, and see if there is any difference side by side. I don’t think there will be any difference unless you are zooming in. But I don’t think you will get more DR.

I am also confused about the BMPCC at 1080p. ProRes and RAW have the same DR, 12.5 stops?! The noise from RAW is completely different than from ProRes. And in RAW you have highlight recovery options, shadow roll-off, etc. It must be at least 12.5 stops vs. 12.6 – a 0.1 difference, maybe 0.5, up to 13 stops. I did tests with the BMMCC and I got more DR when shooting in RAW.

GH5 = 10.8 stops
GH5S = 12 stops

WTF? NEVER. I’ve seen so many comparison videos, and the DR between the GH5 and GH5S looks the same – maybe 0.1 stops different, but never 1 stop.

The Sony A7S II with 12.7 stops – more than the old Pocket camera?!

Again… NEVER. I’ve used RED and ARRI, but also Canon, Sony, Panasonic, the URSA Mini Pro, the BMMCC, etc. Maybe I don’t understand the list, but this cannot be true.

andi
Guest
October 23rd, 2018
Reply to  Gunther Machu

What is the dynamic range test method based on – pre-edit (straight out of camera) or post-production (after grading, etc.)? Because these two methods are different animals.

FilmKit
Guest
October 22nd, 2018

Glad to see the cinema5D Lab is back! I saw the chart with recent results, but is there a page where I can… https://t.co/Hfdfz5eLKt

 Michael Della Polla
Member
October 22nd, 2018

Where are you based? Any way I could supply my Z Cam e2 for testing? I’m in NYC.

Johnnie Behiri
Admin
October 22nd, 2018

Hi Michael.
We are in Vienna, Austria….
In any case, it will be nice to hear your impressions using the new Z Cam e2.

Thank you
Johnnie

Dsx
Guest
October 22nd, 2018

Heck yeah! Finally, third-party, non-partisan technical tests! :) Can we see the UMP, EVA-1, C200, C300 Mk II, F5, F55, RED Gemini, RED Raven, and the BMD BMPCC 4K?

Derek McCabe
Guest
October 22nd, 2018

Great article, well written.

