
  • What's up, Josh here.

  • So Apple just finished up WWDC 2024, where they announced several major updates to iOS, macOS, iPadOS, and some minor updates to their other platforms.

  • There were some good quality-of-life updates that we've been requesting from Apple for quite some time now, including scheduled messages, better home screen customization, and even an iPad calculator app, something I never thought I'd see.

  • But for the last 40-ish minutes, Apple talked about Apple Intelligence, which is their brand new suite of AI-powered features that has been highly anticipated for some time now.

  • And so in this video, we're gonna try to break down everything that Apple Intelligence can do and how you can take advantage of those features coming in the next few months.

  • Make sure to like the video, subscribe if you aren't already, and let's get into it.

  • So let's just start with the basics.

  • What is Apple Intelligence?

  • So Apple Intelligence is not to be confused with Siri.

  • You can sort of think of Apple Intelligence as the foundation, or the groundwork, or the engine behind how all of these AI-powered tasks run.

  • And they're powered by Apple's latest advancements in Apple Silicon.

  • That means that in order to take advantage of Apple Intelligence, you need a device with the latest Apple Silicon, like the M-series chips or the A17 Pro, which is currently limited to the iPhone 15 Pro. I understand that's a little disappointing for those of us with older hardware, considering the 14 Pro isn't even two years old at this point.

  • But there is a good reason for why it's limited to those devices, which we'll cover later on.

  • So now you might be asking, what can Apple Intelligence do?

  • And I think the first bucket of features that we all probably do daily is with writing.

  • So I'd say the biggest announcement with Apple Intelligence in terms of text is what's called Writing Tools.

  • This is a new system-wide tool that you'll be able to use in any app and brings a couple of features that we've already probably seen before in other third-party AI-powered apps.

  • So as you can see, when you're in a text field or anywhere where you're able to type, you can actually bring up this menu, which gives you a couple of different options.

  • For example, you can either rewrite something in a different tone, fix grammatical errors, or summarize a piece of text.

  • And then if you wanted to request something more specific, you can just write out your request at the top.

  • This is not anything new.

  • We've seen these features in other apps as well, but I think the biggest game changer here is that it's actually system-wide, meaning you don't have to go into a separate app to use these features, nor do you only have to be in an app that's made by Apple.

  • As long as you're using an Apple device, like an iPhone, Mac, or iPad, you're gonna have access to these writing tools.

  • And this one feature alone, as insignificant or as basic as it might seem, is actually a pretty big deal because it sort of eliminates the need for third-party subscriptions to apps that do the same thing.

  • But this also opens the door to generative text for so many other people.

  • Think about how many people use iPhones, Macs, and iPads, and how many of them are going to have access to generative text directly on their devices.

  • It's been shown time and time again that when Apple does something, it sort of just catches on like wildfire.

  • And so I think pretty soon we're gonna see a lot more people utilizing generative text.

  • So yeah, that is writing tools, but then there's also what's called smart replies.

  • This is a feature in Apple's new Mail app, which uses AI to draft up a response to an email, and will even come up with custom prompts based off of action steps or questions in the emails.

  • So instead of going into Writing Tools and asking it to generate an email saying you are or aren't going, this new smart reply integration will just give you some toggles to say yes or no, and basically lets you reply to an email with just a couple of taps and probably some tweaks, which is actually pretty crazy. I personally can't wait to take advantage of this feature.

  • But then one subcategory of writing, or I guess text in general, is summarization.

  • And with Apple Intelligence, there's a couple of ways that they're summarizing pieces of text.

  • So again, within the new Mail app, you'll now have a section dedicated to priority emails, which are just your most important time-sensitive emails.

  • Then at the top of a specific email, you can, of course, summarize that email with a single tap, which is gonna be super handy.

  • But also, in every single email, you might have noticed that they now all include a high-level summary of the contents of that email, rather than just displaying the first few lines, which are often not super helpful.

  • The same thing goes for notifications, which now have a priority section for your most important notifications.

  • And then within each notification are summarizations, which are gonna be great for catching you up on group chats.

  • I think these summarizations are gonna add up to a lot of saved time in the long run, and are really gonna make a significant difference in terms of productivity and efficiency.

  • Now, the second bucket of features that are powered by Apple Intelligence are everything that has to do with images.

  • One feature coming to the Photos app is called Cleanup, which is similar to Google's Magic Eraser, which automatically identifies and removes distracting elements from the background of your pictures.

  • I mean, we sort of saw this coming.

  • Google's been doing it for a while now, but it is nice that we are finally getting it in the Photos app.

  • And then another feature is turning your entire photo and video library into something actually searchable with natural language.

  • So now you can just describe what you wanna find and it'll find it.

  • Again, this is another feature that Google has done for a while now.

  • And for me personally, I'm already pretty invested in the Google Photos ecosystem.

  • So not sure how much I'll be using this one, but it's nice to see that it's there.

  • But then Apple announced something called Image Playground, which sort of reminds me of Google's AI Wallpaper feature, where you pick from a theme and then fill in some words to generate an image as a wallpaper.

  • But I think Apple did it a little bit better here because instead of just generating wallpapers, you can actually generate images to be used in messages.

  • So just like Google's AI Wallpaper, in Image Playground, you can tap on these broad themes or broad concepts.

  • And when you tap on them, they get added to your playground where an image then gets created for you in just a couple of seconds to share in messages.

  • If you want something a little bit more specific, you can also just describe it using text, which then gets added into your playground.

  • You can pick from three styles, which are animation, illustration, and sketch, and even choose to incorporate people into these generations to make them feel more personalized to that conversation.

  • So if you wanted to generate a picture of your mom in a superhero cape because you appreciated her so much, you can now actually do that because Apple Intelligence knows what your mom looks like.

  • Oh! Based off of what I assume is your photo library.

  • I think for those moments where we can't find the perfect GIF or we want something to feel a little bit more personalized, this is gonna come in clutch.

  • But then possibly the most Apple thing that they could have done is roll out Genmojis, which stands for generative emojis.

  • So this is another way of utilizing generative images built directly into the keyboard where you can now just type out a description of an emoji and it'll give you that exact emoji in the classic Apple illustration style of emojis.

  • And again, you can also now create these Genmojis based off of real people in your photos library.

  • And honestly, this is something that I never thought we would see.

  • This basically changes the emoji game forever.

  • Now, one question I have is how does it work on the backend?

  • Because emojis typically have a Unicode code point, some standardized sequence of zeros and ones that identifies that emoji.

  • So when an emoji gets sent to another operating system like Windows or Android, that system can look up the code and render its own similar-looking emoji.

  • But then in this case, it's just creating emojis out of thin air.

  • So I don't really know how these emojis are gonna show up if you send them to an Android device or view them on a Windows device.
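
For context on why that's tricky: a standard emoji is just a Unicode code point (or a sequence of them), a shared number that every operating system maps to its own artwork. A quick Python sketch, not tied to anything Apple announced, shows the idea:

```python
# A standard emoji is a Unicode code point that every OS renders with
# its own artwork -- the shared number is what makes it cross-platform.
grin = "😀"
print(f"U+{ord(grin):04X}")  # → U+1F600, the universal ID for "grinning face"

# Some emojis are sequences of code points joined by zero-width joiners (U+200D).
# Built from escapes here so the structure is explicit: man + ZWJ + woman + ZWJ + girl.
family = "\U0001F468\u200D\U0001F469\u200D\U0001F467"
print([f"U+{ord(ch):04X}" for ch in family])
# → ['U+1F468', 'U+200D', 'U+1F469', 'U+200D', 'U+1F467']
```

A Genmoji presumably has no code point of its own, which is why it would likely have to travel as an ordinary image on non-Apple platforms.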

  • But regardless, this is definitely something that I'm excited about.

  • And then finally, we have maybe the most impressive part of Apple Intelligence, which is the third bucket, actions.

  • So on top of helping us write and summarize text and maybe generate some images, the one thing that we've been dreaming that our phones would be able to do is actually be a really smart assistant that incorporates everything that these phones know about our lives and makes it useful.

  • And I can finally say that we are getting a glimpse into what that might look like with iOS 18 and Siri.

  • So Siri now has a brand new look with this new animation for when you prompt it with this glowing border around the entire screen, which sort of gives the vibe that it's integrated into your phone and that it's almost like living inside of it.

  • At least that's how I feel.

  • And one thing that's largely improved with Siri is that it has a much better language model, which will understand you even better.

  • So even if you stumble on your words or you say the wrong thing, you can actually correct yourself mid-sentence and Siri will understand you.

  • What does the weather look like for tomorrow at Muir Beach?

  • Oh wait, I meant Muir Woods.

  • The forecast is calling for clear skies in the morning near Muir Woods National Monument.

  • But now, on top of these very basic surface-level requests that we're used to doing with Siri, the new Siri powered by Apple Intelligence is actually able to use context clues based off of your personal information in different apps.

  • So this includes stuff like your calendar events, your emails, but not just your emails, but also the attachments in those emails, also your text history, your photos, your notes, and it can even look at your screen if you wanted it to.

  • Now, the example they gave is if you got a work email to push a meeting to 5 p.m. and you're afraid it might conflict with your daughter's play later in the day.

  • Well now, instead of having to pull up the play details that your daughter sent you, whether through text or email or wherever she sent it, Siri can just find that for you, cross-reference that data with your meeting time, and also look into the commute times to help you figure out whether you can actually make that play or not.

  • Which, I think, is insane.

  • If it actually works as intended, this honestly reminds me of Jarvis from Iron Man, or just an actually smart assistant that knows everything about you and can help you throughout your day.

  • Another example that makes use of Siri being able to look at your screen is, let's say your friend sends you their address.

  • Well now, you can just ask Siri to add that address into that contact.

  • Or maybe you're filling out a form and you need to find your driver's license number.

  • Well now, Siri can pull that up for you, assuming that you have a photo or scan of your driver's license stored somewhere on your device.

  • Now this brings us to privacy because obviously there is a huge concern with allowing AI to look into all of your documents.

  • And we've already seen some pretty big blunders when it comes to this stuff, specifically with Microsoft and their recall feature.

  • So how is Apple handling privacy when it comes to all the sensitive information?

  • One thing that Apple mentioned is that Apple Intelligence was designed with a strong emphasis on on-device processing.

  • This means that for a lot of the tasks, the device is actually able to process everything on device without collecting or storing any information or bouncing anything to an external server.

  • And this explains why Apple Intelligence is only rolling out to the latest Apple products with the latest Apple Silicon with dedicated processing for these complex AI tasks.

  • And going this route, I assume there are other benefits apart from just privacy.

  • For example, latency and speed: being able to process everything on device means you don't have to rely on an internet connection to bounce things back and forth between your device and a server.

  • And I'm also assuming that operating costs are cheaper: not having to process potentially hundreds of millions of Siri requests on your own servers, but rather having all of that done on the user's device, could save a lot of money.

  • But not all requests are gonna be handled on device.

  • So for more complex stuff, Apple talked about Private Cloud Compute, which only uses the data strictly necessary for the specific task you're requesting and never stores any of that information or makes it accessible to Apple.

  • Apple also says that independent experts can inspect the code that runs on these servers to verify these privacy promises.
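
To make that on-device-versus-cloud split concrete, here's a purely hypothetical sketch of the routing idea; none of these function names or thresholds come from Apple, they're invented for illustration:

```python
# Hypothetical sketch of on-device-first processing with a stateless
# cloud fallback -- not Apple's actual architecture.

def run_local_model(text: str) -> str:
    """Stand-in for on-device processing: nothing leaves the device."""
    return f"on-device answer to: {text}"

def private_cloud_compute(payload: dict) -> str:
    """Stand-in for the server path: it sees only the minimal payload
    and keeps no state between calls (nothing stored or logged)."""
    return f"cloud answer to: {payload['query']}"

def handle_request(text: str, complexity: int) -> str:
    # Simple requests stay on device; only complex ones fall back to the
    # cloud, and even then only the strictly necessary fields are sent.
    if complexity <= 5:
        return run_local_model(text)
    payload = {"query": text}  # data minimization: no contacts, photos, etc.
    return private_cloud_compute(payload)

print(handle_request("What's the weather tomorrow?", 2))
# → on-device answer to: What's the weather tomorrow?
print(handle_request("Can I still make my daughter's play?", 9))
# → cloud answer to: Can I still make my daughter's play?
```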

  • Last but not least, Apple announced their partnership with OpenAI and is now directly integrating ChatGPT within Apple Intelligence.

  • So for more research focused requests, like if you ask Siri to come up with a recipe based off of some ingredients that you have, it'll now give you a button that says use ChatGPT, which then bounces your questions to ChatGPT and gets you an answer right there on the spot.

  • You can also share photos with your questions.

  • So if you're looking at a photo and asking questions about it, Siri will now ask you if you wanna share that photo with ChatGPT.

  • This also works with documents, presentations, and PDFs, and it's all gonna be powered by GPT-4o, completely free without needing to create an account.

  • Overall, to sum it all up, I think Apple Intelligence is a good step forward in terms of Apple's overall vision for the future of AI.

  • I think with these types of demos, we sort of have to take them with a grain of salt because we don't really have them in our hands yet.

  • So things like response times, overall accuracy of the answers, and overall usefulness of these features are still up in the air.

  • Also, one interesting observation that I had is that they never really even showed off the voice of this new Siri.

  • I think in the entire demo, they never allowed us to actually listen to what she sounds like.

  • But yeah, definitely an exciting few months ahead of us across iOS, macOS, and iPadOS, as well as all of the other things that we didn't even get to cover in this video.

  • So definitely get subscribed if you aren't already for more Apple content.

  • But what do you guys think of Apple Intelligence?

  • Do you think Apple under-delivered or over-delivered?

  • Let me know in the comments.

  • Leave a like on your way out.

  • Again, subscribe if you aren't already, and I'll catch you guys in the next one.

  • Peace.
