The New 2020 iPhones Have Hollywood-Quality Video Recording — Here's What That Means
It’s common knowledge these days that Apple puts great cameras into its iPhones. So, it probably comes as no surprise to learn that the iPhone 12, 12 mini, 12 Pro, and 12 Pro Max sport some excellent shooters. What you might not know, however, is that these cameras are quite literally capable of Hollywood-quality video. So don’t be surprised to see one of these filming a Netflix series.
The late-2020 models aren’t the first iPhones to have the term “Hollywood” associated with them. Filmmakers have experimented with Apple’s cameras for years. Feature films like “Tangerine” and “Unsane” made their iPhone-only cinematography part of their marketing campaigns. But Apple’s latest offerings bring something completely new to the game — Dolby Vision and 10-bit video.
- Don’t Miss: The iPhone 12 Pro & 12 Pro Max’s Specs & Features
First of all, it’s not going to mean much to you that your new iPhone can shoot video with Dolby Vision if you don’t even know what Dolby Vision is. Simply put, Dolby Vision is a mastering and delivery format applied to video footage that gives you much greater control over how each frame is color graded and tone mapped. And since the new iPhone models can film in HDR, or high dynamic range, there’s far more light and color information for that processing to work with.
HDR is a technology that lets videos capture and display a much wider range of brightness, from deep shadows to intense highlights. If you’re filming inside a dark room with a bright window, HDR captures more of the bright light from the window, in addition to the dark elements of the room. That same scene without HDR looks much flatter, perhaps with less detail in the window and the room’s shadows.
Now, when you shoot HDR video on any of the iPhone 12 models, what you’re really doing is taking a rapid series of pictures, called frames. You might shoot 24, 30, or 60 frames per second. Each video frame goes through your iPhone’s ISP (image signal processor), which builds a histogram, a map of the tonal and color values in that frame. Your iPhone then uses that histogram data to color grade your video with Dolby Vision. That results in boosted colors, enhanced lights, and deeper blacks.
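To make the idea concrete, here’s a toy sketch in Swift of “build a histogram, then remap the tones” for a single frame. It’s purely illustrative: this is not Apple’s ISP pipeline or Dolby Vision’s actual math, and the function names are invented for this example.

```swift
// A toy sketch of "build a histogram, then remap tones" for one frame.
// Purely illustrative: not Apple's ISP pipeline or Dolby Vision's actual math.

/// Counts how many samples fall into each of 256 brightness bins.
func luminanceHistogram(of frame: [UInt8]) -> [Int] {
    var bins = [Int](repeating: 0, count: 256)
    for sample in frame { bins[Int(sample)] += 1 }
    return bins
}

/// Stretches a dull, low-contrast frame so its darkest sample maps to 0
/// and its brightest maps to 255, a crude stand-in for per-frame tone mapping.
func toneStretch(_ frame: [UInt8]) -> [UInt8] {
    guard let lo = frame.min(), let hi = frame.max(), hi > lo else { return frame }
    let scale = 255.0 / Double(hi - lo)
    return frame.map { UInt8((Double($0 - lo) * scale).rounded()) }
}

let dullFrame: [UInt8] = [90, 100, 110, 120, 130]   // narrow tonal range
print(luminanceHistogram(of: dullFrame)[90])        // 1 sample lands in bin 90
print(toneStretch(dullFrame))                       // [0, 64, 128, 191, 255]
```

The real pipeline works on millions of pixels per frame and produces far subtler adjustments, but the principle is the same: measure the frame’s tonal distribution, then remap it.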
In layman’s terms? Dolby Vision is a super smart technology that can read your iPhone’s video and figure out the best ways to improve the overall image quality. And it does all of this while you film, so you can see it happening right on your screen.
Dolby Vision isn’t the only HDR format out there. You might see other formats, like HDR10, plastered over TV models, smartphones, and tech reviews. And you might also see 4K everywhere and wonder how that factors into the mix.
As far as 4K is concerned, that’s a representation of the video’s resolution, or the number of pixels that the video contains. The more pixels, the more detail you can fit in an image. 4K is named as such because 4K video contains a horizontal resolution of roughly 4,000 pixels. It has nothing to do with your video’s dynamic range, just the amount of detail you can fit in the frame.
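If you want to see the numbers, the iPhone records 4K at 3,840 × 2,160 and 1080p at 1,920 × 1,080. Quick arithmetic (plain Swift, just for illustration) shows how the pixel counts compare:

```swift
// Pixels per frame at the resolutions the iPhone records.
let uhd4K  = 3_840 * 2_160   // 8,294,400 pixels
let fullHD = 1_920 * 1_080   // 2,073,600 pixels
print(uhd4K / fullHD)        // 4 (a 4K frame holds four times the pixels of a 1080p frame)
```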
Dolby Vision and HDR10 are just different ways to analyze and enhance HDR in digital videos. One main difference comes down to how proprietary the formats are; HDR10 is a license-free technology. That means any camera or TV manufacturer can add and use HDR10 with their products in any way they see fit. It also means there’s no enforced standard, so you can’t trust HDR10 to look the same across devices.
Dolby Vision, on the other hand, is entirely created and controlled by Dolby Laboratories. To take advantage of the technology, manufacturers work off of Dolby’s standards and need to pay to use the format. It’s a more limiting and expensive ordeal, which is why you see HDR10 more frequently in the tech world than Dolby Vision, but it’s also why Dolby Vision is a Hollywood standard.
Dolby Vision is technologically superior. While HDR10 can display up to 1,000 nits of brightness, Dolby Vision can display up to 10,000. The same goes for colors; HDR10 caps out at 10 bits, while Dolby Vision goes up to 12 bits. That really isn’t a major consideration for you, though, since the new iPhones can only shoot up to 10-bit video at this time (more on bits down below).
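The bit-depth difference is easy to put in numbers: each extra bit doubles the number of shades a single color channel can hold. A quick Swift illustration (plain powers of two, nothing vendor-specific):

```swift
// Shades per color channel at each bit depth: 2 to the power of the bit count.
let levels8Bit  = 1 << 8    // 256 shades (what older iPhones record)
let levels10Bit = 1 << 10   // 1,024 shades (HDR10, and what the iPhone 12 records)
let levels12Bit = 1 << 12   // 4,096 shades (Dolby Vision's ceiling)
print(levels8Bit, levels10Bit, levels12Bit)   // 256 1024 4096
```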
Both HDR10 and Dolby Vision video can be 4K, but they don’t have to be. You can have HDR10 or Dolby Vision video at 1080p resolution, which packs roughly a quarter of the pixels of 4K. Your iPhone can display HDR in either format, but you need an Apple TV, or to AirPlay to a compatible TV set, to get the full 4K HDR Dolby Vision experience.
So, Dolby Vision is pretty sweet and all, but is it a reason to buy the new iPhones? After all, any iPhone since the iPhone 6S can shoot in 4K resolution. So, why make the switch?
As any camera buff would tell you, resolution isn’t everything. Your iPhone 6S might shoot 4K video just like the Hollywood cameras, but other than that, it’s nothing like those Hollywood cameras. And bit depth plays a big part in the difference. Bit depth represents the number of bits of color information a single pixel can store in digital video. The more bits, the more colors. Simple enough, right?
The iPhone 12 lineup shoots 10-bit video, while previous iPhones shot in 8-bit. That might not sound like much to the uninitiated, but those two bits matter. In fact, Apple claims the new iPhone models record video with 700 million colors, 60 times more color than 8-bit video. That’s 60 times more color information a colorist can use to make your footage look like magic.
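For context, the raw bit-depth arithmetic is worth seeing on its own (the 700 million figure is Apple’s own number for the camera, not this theoretical math). Combining the red, green, and blue channels, the maximum palette grows like this:

```swift
// Maximum representable colors = 2^(bits per channel * 3 channels).
let colors8Bit  = 1 << (8 * 3)    // 16,777,216 (about 16.7 million)
let colors10Bit = 1 << (10 * 3)   // 1,073,741,824 (about 1.07 billion)
print(colors10Bit / colors8Bit)   // 64 (two extra bits across three channels = 2^6 times more)
```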
Apple even lets you edit your Dolby Vision footage right on your iPhone. Image by Apple/YouTube
It’s not just the video itself that makes Dolby Vision special — the technology also tells displays like TVs how to present that video. Dolby Vision can instruct a TV to boost the brightness of specific areas of your Dolby Vision video in one scene and amplify the color in another. The LG CX series, Sony A9G series, and Vizio P-Series Quantum X are just some examples of TVs that feature this capability.
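Under the hood, that per-scene control comes from dynamic metadata that travels with the video. Purely as a mental model, here’s a toy Swift sketch; the field names and values are invented for this example and are not Dolby Vision’s actual metadata format.

```swift
// Toy illustration of the *idea* of per-scene dynamic metadata.
// These fields and numbers are invented; they are not Dolby Vision's real format.
struct SceneHint {
    let sceneIndex: Int
    let targetPeakNits: Double   // how hard the TV should push highlights in this scene
    let saturationBoost: Double  // how much to amplify color in this scene
}

let hints = [
    SceneHint(sceneIndex: 0, targetPeakNits: 800, saturationBoost: 1.0),
    SceneHint(sceneIndex: 1, targetPeakNits: 400, saturationBoost: 1.2),
]
// A compatible TV reads hints like these scene by scene, instead of applying
// one static setting to the entire video the way HDR10's static metadata does.
```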
This technology powers a lot of professional video and is present throughout the different places you might experience movies and TV. You might notice Dolby Vision’s symbol next to your latest Netflix obsession, or maybe you’ve attended a special Dolby Vision screening at a movie theater. It’s Dolby Vision — not just 4K — that the industry relies on to guarantee both the video and the viewing experience are pristine.
There’s a reason Dolby Vision is a standard in Hollywood. The industry relies on the technology to ensure that, from start to finish, the movie or TV footage can be distributed and viewed as intended. Cinematographers shoot the video to look a certain way, editors and colorists bring that vision to life, and distributors make sure that, whether you watch on a projector, TV, or smartphone, you get the full experience.
While your iPhone color grades your video automatically as you shoot, you might find that the footage doesn’t look exactly how you’d like it. That’s where editing comes in. Apple proudly announced that apps like iMovie, Photos, and even Clips could edit Dolby Vision video right on the iPhone. That means you get to tweak the colors and lighting in your shots, right on the phone that shot that footage in the first place.
Apple also announced that Final Cut Pro X would gain this ability in an upcoming update, but unfortunately, that software is still Mac-only. If Apple ever brings its pro editing software to mobile, your iPhone will truly be an all-in-one Hollywood machine. Until then, it still gets the Hollywood treatment.
Don’t Miss: All You Need to Know About the iPhone 12 & iPhone 12 Mini
Cover image by Apple/YouTube