What if you could step inside your footage and feel like you're actually there? Apple's Immersive Video format and workflow for the Vision Pro, paired with Blackmagic's new URSA Cine Immersive camera, promise exactly that: an unprecedented leap beyond traditional filmmaking into truly spatial storytelling. We took a close look at this revolutionary workflow, from capture to color grading, to see how it could change the way filmmakers craft stories for the screen, and beyond.
Blackmagic's URSA Cine Immersive has staggering specs. It has two 8K sensors running at 90 frames per second. A typical 4K camera running at 24 frames per second processes 212 million pixels every second. The URSA Cine Immersive processes 10,575 million pixels each second!
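Those throughput figures are easy to verify. Here's the arithmetic, assuming 4K DCI resolution (4096 × 2160) at 24 frames per second and the URSA Cine Immersive's published per-eye resolution of 8160 × 7200 at 90 frames per second:

```python
# Pixel-throughput arithmetic behind the figures above.
# Assumes 4K DCI (4096 x 2160) at 24 fps and the URSA Cine
# Immersive's published per-eye resolution of 8160 x 7200 at 90 fps.

dci_4k = 4096 * 2160 * 24            # pixels per second, single sensor
immersive = 8160 * 7200 * 90 * 2     # two sensors (one per eye) at 90 fps

print(f"4K DCI:    {dci_4k / 1e6:,.0f} million pixels/s")    # 212
print(f"Immersive: {immersive / 1e6:,.0f} million pixels/s") # 10,575
print(f"Ratio:     {immersive / dci_4k:.1f}x")               # 49.8x
```

Note that the true ratio between the two works out to nearly 50x, so the article's later "over 30 times" figure is, if anything, conservative.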
To understand why such a camera exists, you first need to understand the platform it is designed to serve.
In Apple's own words, the Vision Pro is a spatial computer. It allows you to do things like watch movies on a screen as big as your house, interact with apps like they're physical objects in your room, or work on a plane without feeling like you are trapped in a cupboard. All very cool, but we're focusing on the Vision Pro purely as a device for consuming entertainment.
The Vision Pro supports a wide variety of media formats. We’re specifically interested in Apple Immersive Video, a 180-degree stereoscopic video format designed specifically for the Vision Pro. It’s unlike any other format you might have experienced. Whereas traditional 2D media allows you to feel emotionally present, Immersive media allows you to feel physically present.
To achieve that level of realism, the visual experience is designed to match and in some cases exceed the capabilities of human vision. There are two key specs that make this possible.
First, it all starts with the Vision Pro's displays, which have the highest pixel density of any OLED display: 64 Vision Pro pixels fit in the space of a single iPhone pixel. That pixel density is important, as visible pixels would break the feeling of immersion.
Second, it's commonly accepted that human vision tops out at around 70–80 frames per second; anything beyond that is imperceptible. Immersive Video therefore runs at 90 frames per second, comfortably above that threshold.
Acquisition needs to meet the demands of the display system. This brings us full circle back to the URSA Cine Immersive.
When compared to traditional 4K DCI capture, Apple Immersive Video has approximately 5x the pixels and 3x the frame rate, and all of that is doubled because it's stereoscopic. That's over 30 times the data rate!
Many previous stereo capture systems were complex. Some required two cameras mounted in large, unwieldy rigs. That complication continued into post-production, as each camera produced its own separate files that had to be synchronised and re-projected into a common format.
This not only added complexity to the post-production process but also degraded the image. It's similar to the generational loss that happens when re-encoding video files: each time you touch the image, artefacts are introduced that compound until the final image is delivered.
On the other hand, the workflow with the URSA Cine Immersive and DaVinci Resolve is much simpler, with far fewer steps.
The URSA Cine Immersive captures a separate fisheye image for each eye.
The term ‘Lens Space’ is used to refer to an image in the native format as captured by the lens. Most 180 or 360 cameras also use fish-eye lenses. However, at some point before the image is distributed, it is converted from the camera’s ‘Lens Space’ into a common format like Equirectangular (also known as LatLong).
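To illustrate what that conversion involves, here is a minimal sketch of re-projecting a fisheye image into an equirectangular (LatLong) frame. It assumes a simple equidistant lens model and nearest-neighbour sampling; real lens profiles, including the URSA Cine Immersive's, are far more sophisticated and proprietary.

```python
import numpy as np

def fisheye_to_equirect(src, out_w=1024, out_h=512, fov_deg=180.0):
    """Re-project a single equidistant fisheye image into an
    equirectangular frame covering 180 x 180 degrees.
    Nearest-neighbour sampling keeps the sketch short."""
    h, w = src.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    max_r = min(cx, cy)                     # fisheye image circle radius
    half_fov = np.radians(fov_deg) / 2.0

    # longitude/latitude for every output pixel (-90..+90 degrees)
    lon = (np.arange(out_w) / (out_w - 1) - 0.5) * np.pi
    lat = (0.5 - np.arange(out_h) / (out_h - 1)) * np.pi
    lon, lat = np.meshgrid(lon, lat)

    # unit view vector; z points along the lens's optical axis
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # equidistant model: image radius grows linearly with off-axis angle
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    r = theta / half_fov * max_r
    phi = np.arctan2(y, x)

    # sample the source fisheye at the corresponding pixel
    u = np.clip(np.round(cx + r * np.cos(phi)), 0, w - 1).astype(int)
    v = np.clip(np.round(cy - r * np.sin(phi)), 0, h - 1).astype(int)
    return src[v, u]
```

Every such resampling step interpolates between source pixels, which is exactly the quality cost Apple's lens-space delivery avoids.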
That’s not the case with Apple Immersive Video though! Would it surprise you to know that the finished image is delivered to the Vision Pro in lens space? A finished Apple Immersive Video contains the original fish-eye images captured by the camera’s two lenses and sensors. There are two main reasons why:
First, the goal of Apple Immersive Video is to present the highest-fidelity experience possible. That's why images shot for Immersive are preserved in their native Lens Space all the way through to the Vision Pro, where they are re-projected for the first and only time.
Second, each lens and sensor pair on a stereoscopic camera must be individually profiled so that minor discrepancies between them can be corrected when the image is finally re-projected. This is especially important when you are attempting to deliver quality that mimics human vision.
Remember that re-projection only happens once, on the Vision Pro. When the camera records, it embeds its factory calibration into that Blackmagic RAW file as metadata so that it can be read by DaVinci Resolve. When you render out your Immersive video file from DaVinci Resolve, it embeds the camera’s calibration metadata into the video file that will get played on your Vision Pro. Because more than one camera might be used on a shoot, DaVinci Resolve actually embeds separate metadata for every single clip in the Immersive timeline.
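The per-clip metadata idea can be sketched abstractly. The structures and numbers below are purely illustrative; the actual calibration format embedded in Blackmagic RAW and in Resolve's exports is proprietary.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LensCalibration:
    # Hypothetical per-lens correction terms, standing in for the
    # real (proprietary) factory calibration data.
    focal_px: float
    center_x: float
    center_y: float

@dataclass(frozen=True)
class Clip:
    name: str
    start_frame: int
    end_frame: int
    calibration: LensCalibration  # carried per clip, as in Resolve's export

def calibration_for_frame(timeline, frame):
    """Playback-side lookup: every frame is unwrapped using the
    calibration of whichever clip it belongs to."""
    for clip in timeline:
        if clip.start_frame <= frame <= clip.end_frame:
            return clip.calibration
    raise ValueError(f"no clip covers frame {frame}")

# Two clips shot on two (hypothetical) camera bodies with slightly
# different factory calibrations:
timeline = [
    Clip("shot_01", 0, 89, LensCalibration(3400.0, 4080.0, 3600.0)),
    Clip("shot_02", 90, 269, LensCalibration(3402.5, 4079.2, 3600.8)),
]
cal = calibration_for_frame(timeline, 120)  # frame 120 -> shot_02's data
```

The point of the sketch: because the calibration travels with each clip, cutting between cameras never requires the editor to manage lens profiles by hand.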
When you play back that file on your Vision Pro, it is dynamically unwrapping and re-projecting every single clip with its own metadata.
Herein lies one of Blackmagic and Apple’s greatest innovations in the immersive video workflow – all of this happens automatically and without user intervention, thus removing one of the biggest friction points in legacy stereoscopic production workflows.
It’s hard to describe the Apple Immersive experience. It’s something that you have to experience yourself. If you’ve not tried a Vision Pro, ask a friend, or go visit an Apple Store (click here to book a demo) and watch your way through Apple’s growing library of Immersive content.
It's not just exciting for consumers; it's also incredibly exciting for filmmakers, because it represents one of the biggest leaps in filmmaking technology we've seen in a while. As with the invention of the cut or the advent of sync sound, technological innovations have a tendency to shape the way stories are told.
We're on the cusp of another revolution. Some of our 2D storytelling techniques won't work in this medium. But new techniques have already been discovered, and there are more yet to be discovered, maybe by you.
If you’d like to learn more about the URSA Cine Immersive and the Apple Vision Pro, please watch our full video where we…
In case you missed it, we ran an interview with Blackmagic Design at NAB 2025 about the new camera and the workflow:
Recently, we published video interviews with the filmmakers behind the new Apple Vision Pro productions “Bono – Stories of Surrender” (Elad Offer, link here) and “D-Day: The Camera Soldier” (Victor Agulhon, link here). These interviews provide insights into the filmmaking process in this emerging immersive medium.
Have you tried immersive video yet? We're curious: do you think this is the future of filmmaking, or just another passing tech trend? Let us know what you think in the comments below!
Stay current with regular CineD updates about news, reviews, how-to’s and more.
You can unsubscribe at any time via an unsubscribe link included in every newsletter. For further details, see our Privacy Policy
Leon Barnard is a director, cinematographer, and editor with a passion for training the next generation of filmmakers. Through Team 2 Films, he provides valuable insights into camera equipment and DaVinci Resolve, helping creatives refine their craft.