- Apple kind of surprised everyone with the iPhone 14 Pro, right? I mean, there had been tons of rumors about the notch turning into a cutout, or two cutouts, but no one was expecting the Dynamic Island. And while it was getting clearer that Apple would have to follow the rest of the industry and use bigger camera sensors eventually, I certainly wasn't expecting the company to reboot the entire iPhone computational photography system as the Photonic Engine.

There's a lot of that sort of thing in the iPhone 14 Pro. Apple's late to having an always-on display, but this one's way more on than other phones'. In the United States, Apple's going all in on eSIM, which no one else is really doing. There's a basic satellite connectivity system, which isn't quite like anything else we've heard about, but Apple's going to ship millions of these phones, with the service coming later this year. All in all, there are more beginnings of big ideas in the new iPhone 14 Pro than we've seen in an iPhone for a long time. That's the easiest way to think about the iPhone 14 Pro: it feels like the first step toward a lot of new things for Apple and the iPhone, and maybe the first glimpse of an entirely new kind of iPhone. But that doesn't mean all these things are perfect yet. Let's take a look.

(bright music)

We have to start with the Dynamic Island, right? The name is ridiculous, but it's fun. Everyone's talking about it, which is not normal for a smartphone status indicator system. That's a win; that's where I live every day. But let's just agree that we're gonna call it the island and move on.

So the island replaces Apple's notch. It's where the front camera and the Face ID system live, since they've got to go somewhere on the front of the display. Here's the thing about the notch, though: in almost every review going back to the iPhone X, I've said, "Hey, after a couple minutes, you don't notice the notch." The island is different. You are supposed to notice it. It's located lower on the screen than the notch. And if you run your phone in light mode, like I do, it's actually a high-contrast interface element: a black pill shape in the middle of a white screen. It's right there. You're going to see it, especially since it's animating and moving all the time. Now, it's better in dark mode. In fact, I would go so far as to say this is the first iPhone you should definitely run in dark mode because of it.

So why did Apple go from "the notch is there, but you can all but ignore it" to "the Dynamic Island is here, and you are going to pay attention to it"? Well, it turns out that over the years, there have been like three or four different status indicator systems added onto iOS. Plugging in a charger or flipping the mute switch brings up an overlay. Having a call in the background puts a green pill in the corner. The Maps app is a blue pill. Connecting AirPods is another overlay. Some things, like timers and music playing in the background, haven't really had status indicators at all.

The island is Apple's way of replacing and unifying all those older status systems with a new home for system alerts, and making it work for things like music and the new Live Activities API that's coming to iOS 16 later this year, which will allow apps to share even more background info for things like your flight status or a sports score. The simplest way of understanding the island is that it's basically a new widget system built on top of that Live Activities API.
And the widgets can have three views: the main view, this expanded view, and this ultra-minimal icon when you've got two things going at once. Here's a list Apple sent over of all the things that will support the island at launch, before Live Activities arrives and third parties can use it too.

So that's the concept. The execution is obviously where the action is. And like all first versions of anything, Apple's made some choices that really work, and some others that, eh.

So here's a big choice that really works, and it's just, like, purely Apple. One of the reasons it's called the island is that it's meant to float over the rest of the system. It's a layer on top of iOS. It's supposed to feel more like hardware than software, almost like a secondary display that can get bigger or smaller. To get this to feel right, Apple's actually using a new dynamic sub-pixel anti-aliasing system that makes the edges of the island up to three times crisper than all the other animations in iOS. Here's a macro shot of the individual RGB sub-pixels of the display being anti-aliased to create a sharp edge for the island. In normal room lighting, this really works. It feels like the cutout on the display is getting bigger and smaller. And the animations, like this drop effect for when the island splits in two, are super fun. In sunlight or brighter light, yeah, you can see the camera sensors and the illusion kind of goes away, but it's still cool.

The other big thing that works is that moving all these disparate status indicators to the island and making them worth paying attention to is actually pretty great. It's nice having call info right on the screen. It's genuinely useful having your timers right there. And making things like AirDrop and Face ID show up in consistent ways in the same place makes them easier to understand, which is great.

The thing that kills me is that in the keynote, and in all the ads, Apple shows the island as a thing that's worth interacting with. It's always moving and going back and forth between the main view and the expanded view. In reality, it is not like that at all. The island isn't a primary interface element. It sits over whatever app you're actually using, and apps are still very much the main point of the iPhone. In fact, tapping on the island doesn't open that expanded view. It just switches you back to whatever app controls the widget. To get the expanded view, you have to tap and hold. This feels exactly backwards to me. Now, Apple knows I feel this way. The idea, apparently, is that things should be as simple as possible, and going back to the app is the simplest thing. Nah, I don't know. I think a tap should pop open the widget, and I definitely think you should at least be able to choose.

This is kind of the whole tension of the island. It's much more noticeable and useful than the notch, but you're not really supposed to interact with it. It's background information. All those questions about whether you're gonna get fingerprints all over the camera? Well, as it stands, you don't touch this thing very much at all. But because it's so much more prominent, you're looking at it all the time. I'm using it with a bunch of apps that haven't been updated, so it kind of covers up some content, because it sits lower on the display. So right at this second, the trade-off between how noticeable it is and how useful it is is a little out of whack. It doesn't quite do enough to always be in the way.

Now, I think all this might change when that Live Activities API rolls out later this year, which is the other big thing Apple did right: it made this system available to third-party developers.
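For a sense of what developers will get, here's a rough sketch of a Live Activity built against Apple's pre-release ActivityKit and WidgetKit APIs. The `DeliveryAttributes` type and its fields are made up for illustration, and the details could still shift before the API ships, but the three island presentations map directly onto the views described above:

```swift
import ActivityKit
import SwiftUI
import WidgetKit

// Hypothetical attributes for a delivery-tracking Live Activity.
// The static part identifies the activity; ContentState is the data
// the app keeps updating in the background.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var etaMinutes: Int
    }
    var orderNumber: String
}

struct DeliveryLiveActivity: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: DeliveryAttributes.self) { context in
            // Lock screen / notification banner presentation.
            Text("Order \(context.attributes.orderNumber): \(context.state.etaMinutes) min away")
        } dynamicIsland: { context in
            DynamicIsland {
                // The expanded view, shown on tap-and-hold.
                DynamicIslandExpandedRegion(.center) {
                    Text("Arriving in \(context.state.etaMinutes) min")
                }
            } compactLeading: {
                // The main compact view wraps around the camera cutout.
                Image(systemName: "shippingbox")
            } compactTrailing: {
                Text("\(context.state.etaMinutes)m")
            } minimal: {
                // The ultra-minimal icon, used when two activities share the island.
                Image(systemName: "shippingbox")
            }
        }
    }
}
```

The app starts one of these with ActivityKit's `Activity.request(...)` call and then pushes updated state from the background; the system owns all of the island's animation and placement.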
But right now, the Dynamic Island feels like one of those things that needs a year of refinement and developer attention before we really know how important it is.

The big feature of the iPhone 14 Pro camera system is the new 48-megapixel main camera sensor. Apple's actually a few years late to this trend. Samsung has used 108-megapixel sensors since 2020, and Google added a 50-megapixel sensor to the Pixel 6 Pro last year. The basic idea is the same all around: to take better photos, you need to collect as much light as possible, and to do that, you need bigger pixels. But at some point, making the pixels physically bigger gets hard, so instead you just add a lot more pixels and use software to group them into giant virtual pixels. This whole thing is called pixel binning. And the math on Apple's binning is straightforward: it uses four pixels to create a single virtual pixel, which means that 48-megapixel sensor generally shoots 12-megapixel photos.

Apple's also reworked its entire photo processing pipeline and rebranded it the Photonic Engine. The big change here is that the Deep Fusion pixel-by-pixel analysis that happens in mid and low light now happens earlier in the process, on uncompressed data. It's never really been easy to see how much it's doing, and, well, it's the same on the iPhone 14 Pro. Honestly, the 14 Pro and 13 Pro take really similar photos. Here's Verge video producer Mariya Abdulkaf in a really dim bar. The 14 Pro is a little cooler, and it captures a tiny bit more detail at 100 percent, but you really have to go looking for it. These photos of Mariya outside look pretty much the same, but if you zoom in, you can see the 14 Pro is getting a bit more detail, and it has a nicer background blur because of the substantially larger sensor. This is all really nice, but at Instagram sizes, it is not particularly noticeable.

Here's that same photo on the Pixel 6 Pro, by the way. You can see it captures even more detail with its pixel-binned 50-megapixel sensor, along with a wider range of colors. This is about as different as the Pixel and the iPhone have been in a few years. They both grab a lot of detail and have great low-light performance, but the Pixel 6 Pro makes very different choices about highlights and shadows, while the iPhone is way more willing to let highlights blow out and even let some natural vignetting creep in. Both of these photos are terrific, and the one you prefer is entirely down to subjective preference.

Where the iPhone 14 Pro falls down is really in the details of the processing. Apple's been ramping up the amount of noise reduction and sharpening over the years, and the 14 Pro has the most aggressive sharpening and noise reduction yet. Sometimes it just looks bad. This night skyline shot is kind of an over-processed mess compared to the Pixel.

Compared to the Samsung S22 Ultra, the iPhone is a little less predictable. The S22 Ultra consistently holds onto more color detail in low light, and it's not as heavy-handed with that noise reduction and sharpening. In bright light, the differences between the 14 Pro and the S22 Ultra are more subtle, but Samsung still does a better job with detail. In true Samsung fashion, though, you get much punchier and warmer colors compared to the more natural look of the iPhone. I mean, look at these sunset photos. Samsung's color ideas are sometimes from an entirely different planet, but photo for photo, the S22 Ultra is more consistent, with better fine detail.
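If you want that four-into-one binning in concrete terms, here's a toy sketch. To be clear, this isn't Apple's pipeline; real binning happens down on the sensor and image signal processor and is aware of the Bayer color filter layout, while this just averages plain 2x2 blocks of raw sensor values:

```swift
// Toy 2x2 pixel binning: every four sensor pixels become one "virtual"
// pixel. Resolution drops by 4x (48MP -> 12MP), but each output pixel
// is built from 4x the light-gathering area, which is what cuts noise.
func bin2x2(_ raw: [[UInt16]]) -> [[UInt16]] {
    let outRows = raw.count / 2
    let outCols = (raw.first?.count ?? 0) / 2
    var binned = [[UInt16]](
        repeating: [UInt16](repeating: 0, count: outCols),
        count: outRows
    )
    for r in 0..<outRows {
        for c in 0..<outCols {
            // Average the 2x2 block. A real ISP sums charge or weights
            // same-color photosites rather than naively averaging.
            let sum = Int(raw[2 * r][2 * c]) + Int(raw[2 * r][2 * c + 1]) +
                      Int(raw[2 * r + 1][2 * c]) + Int(raw[2 * r + 1][2 * c + 1])
            binned[r][c] = UInt16(sum / 4)
        }
    }
    return binned
}
```

The trade is a quarter of the resolution in exchange for each output pixel seeing four times the light. Keep that in mind, because the 2X zoom is the same sensor used the opposite way.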
Now, Apple isn't just pixel binning its sensor, it's also cropping it to generate what it calls an optical-quality 2X zoom. Basically, it's just taking the middle 12 megapixels of that 48-megapixel sensor. If you shoot in ProRAW at the full 48 megapixels and just cut out the center of the image, you don't get the same photo. You don't get the benefit of pixel binning in 2X mode, so it gets into a little trouble in lower-light situations, but it's nice to have, and it's a really nice middle ground between the standard and the 3X tele. That 2X crop is now the default for portrait mode, which doesn't seem to have improved all that much. Both the S22 Ultra and even the regular S22 take better portrait photos. Samsung's really nailed cutting the subject out of the background, down to individual strands of hair, and the 14 Pro isn't quite there yet.

You can also switch the whole camera to ProRAW mode and shoot in full 48-megapixel glory, which generates massive DNG files, anywhere between 50 and 80 megabytes each. If you're the sort of person who wants to do this, it's incredibly exciting, but normal people should probably not shoot 48-megapixel photos on their phone.

The other big update is to video, but, you know, I called in the expert for that.

- Hmm. I'm just kind of having a hard time telling the difference. The iPhone 14 Pro, of course, has great video. I mean, colors are punchy and bold, with just the right amount of sharpness. But all of the same could be said about the iPhone 13 Pro. And outside of a much sharper image in low light from the telephoto, I'm not really seeing a large jump in performance between the two. But Apple has three gimmicks for us this year.

First, there's now an Action mode that's supposed to provide more stable footage. The Pixel 6 Pro had a similar feature, but I find that video footage from most flagship phones in 2022 is already pretty stable, certainly stable enough for mobile viewing. So I never used it there, and I probably won't use it here. Second, there's this sort of smooth zoom feature that smooths out the transitions when you're switching between lenses. It's cool once you notice it, but I think most folks won't notice it at all. And lastly, there's 4K Cinematic mode. It blurs the background just well enough that I actually found myself trying that mode first whenever I was filming a person's face. With anything else, I mean, plants, signs, forget about it, it's just not that smart yet. But does it matter that it's in 4K? For me, it kind of does, but I'm a person who's gonna take that footage into Premiere. Anyone else? I don't think it matters.

Overall, though, the 14 Pro's video is not upgrade-worthy if you're coming from an iPhone 13 Pro. And I think it would've been a little more beneficial if Apple had spent more time getting rid of all of those light reflections at night. And you know, Apple, while you're at it, why don't you add a button to your native camera app that lets you switch between the front-facing camera and the rear camera while you're filming a video? I mean, Samsung has had this feature for as long as I can remember. It's time the iPhone gets it too. Okay, back to the review.

- Apple's made some other big changes to the iPhone 14 Pro. The phone now has an always-on display mode, something Android phones have had for a long time now. It's fine.
The display refresh drops to just one hertz, and the brightness goes super low to save battery life. Apple's done some really nice work here to keep wallpaper colors accurate in the low-power mode. But honestly, I've been fooled into thinking my phone is awake one too many times, and I might prefer a much simpler Pixel-style black and white clock. I hope we see some customization options here in the future. Other than that, the display is slightly brighter than before. It can hit a peak brightness of 1,600 nits, up from 1,200 on the 13 Pro, and in bright sunlight, it can go all the way up to 2,000 nits. You all know I think Apple's mobile displays are consistently the best in the industry, and it's no different here.

This is the part of the video where I'd usually talk about connectivity. But, you know, can we all just admit that the 5G hype balloon basically popped? Like, I turn it off and just use LTE because those networks are less congested. Sorry. Let me know when the self-driving cars are doing robot surgery or whatever.

Anyway, Apple's made a big move to drop SIM trays from iPhones in the United States, which means it's time everyone got used to eSIM, which lets you access mobile networks without needing a physical SIM card. You can sign up for up to eight different networks on the iPhone 14s. It's pretty rad. My iPhone 14 Pro transferred my AT&T account over from my 13 right over Bluetooth, and I added my Google Fi account with just a handful of taps. Now, it's not nearly as easy to move eSIM info from iPhones to Android phones and back again. And carriers are certainly gonna play some weirdo lock-in games here, because they're carriers, and weirdo lock-in games are kind of why they exist. But being able to add new networks quickly and easily to your phone also theoretically means we can all force the carriers to compete a little more. That's definitely a good thing.

Speaking of connectivity, Apple's emergency satellite connectivity system isn't rolling out until later this year, but Allison got an early demo on the Apple campus, and it looks pretty slick. The system walks you through a series of questions to help first responders understand your situation, then it shows you where to point the phone to access a satellite. We've got way more on that in her review of the iPhone 14.

Speaking of Allison, she's also been testing the iPhone 14 Pro, while Becca and I have tested the iPhone 14 Pro Max. And all three of us sort of feel like the battery runs down a little faster than the 13 Pro's. Now, to be fair, all three of us were running around taking lots of photos and videos and generally testing the hell out of these phones for the past week. But, you know, we test a lot of phones like that. Apple says the 14 Pro and 14 Pro Max should get slightly better battery life than the 13 Pros. And yeah, I still got through a full day with the 14 Pro Max. So maybe it was just the always-on display taking a toll, but it's something I'll be keeping my eye on in the future.

So that's the iPhone 14 Pro. The way I've been thinking about it is that the iPhone 13 Pro was the culmination of a lot of ideas for Apple. It was confident and complete and kind of hard to criticize. The iPhone 14 Pro, on the other hand, is the clear beginning of lots of new ideas, like the Dynamic Island, the new camera, even that satellite connectivity system. Because these ideas are new, they're inherently incomplete, so there's lots to criticize.
But they're worth criticizing, which is its own kind of victory, and a sign that Apple isn't holding still with the future of the iPhone. I'm into that. What I don't know is if all these new features are worth it yet. If you're the sort of person who's willing to accept some rough edges to be on the bleeding edge, you're gonna have a lot of fun with the iPhone 14 Pro. You'll be figuring it out right alongside Apple. But for everyone else, it might be worth holding off for a year.

- Look at these clouds. Oh my God. Look at that. Look at those. Oh my God. This stuff over here. Come on. That is just so special.