
  • Mark, welcome to your first SIGGRAPH. How do you see the advances in generative AI at Meta today, and how do you apply them, either to enhance your operations or to introduce new capabilities that you're offering?

  • I think we're going to quickly move into a zone where not only is the majority of the content that you see on Instagram today recommended to you from stuff that's out there in the world that matches your interests, whether or not you follow the people, but in the future a lot of this stuff is going to be created with these tools. Some of that will be creators using the tools to create new content, and some of it, I think, will eventually be content that's either created on the fly for you, or pulled together and synthesized from different things that are out there. So I kind of dream of one day where you can almost imagine all of Facebook or Instagram being like a single AI model that has unified all these different content types and systems together, systems that actually have different objectives over different time frames, right?

  • Because some of it is just showing you the interesting content that you want to see today.

  • But some of it is helping you build out your network over the long term, right? People you may know, or accounts you might want to follow. These multimodal models tend to be much better at recognizing patterns, weak signals, and so on. It's so interesting that AI has been so deep in your company.
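The "one model, different objectives over different time frames" idea above can be sketched as a toy ranking function. This is purely illustrative: the signal names, candidate items, and weights are invented for the example and are not Meta's actual system.

```python
# Toy sketch of multi-objective feed ranking: blend a short-term
# engagement prediction with a long-term network-building prediction.
# All signals and weights here are made-up illustrations.
from dataclasses import dataclass

@dataclass
class Candidate:
    item_id: str
    p_engage_today: float    # predicted short-term engagement (0-1)
    p_new_connection: float  # predicted long-term network value (0-1)

def rank(cands, w_short=0.7, w_long=0.3):
    """Order candidates by a weighted blend of the two objectives."""
    score = lambda c: w_short * c.p_engage_today + w_long * c.p_new_connection
    return sorted(cands, key=score, reverse=True)

feed = rank([
    Candidate("reel_1", p_engage_today=0.9, p_new_connection=0.1),
    Candidate("people_you_may_know", p_engage_today=0.3, p_new_connection=0.9),
    Candidate("followed_post", p_engage_today=0.5, p_new_connection=0.2),
])
print([c.item_id for c in feed])
# → ['reel_1', 'people_you_may_know', 'followed_post']
```

Shifting the weights toward `w_long` would surface more network-building items, which is the "different objectives over different time frames" trade-off in miniature.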

  • You've been building GPU infrastructure, running these large recommender systems for a long time. Though you were slow actually getting to GPUs.

  • Yeah, I was trying to be nice. I know.

  • Well, tell everybody about Creator AI and AI Studio that's going to enable you to do that.

  • Yeah, so this is actually something that we've talked about a bit, but we're rolling it out a lot wider today. A lot of our vision is that I don't think there's just going to be one AI model, right?

  • This is something where some of the other companies in the industry are building one central agent. And yeah, we'll have the Meta AI assistant that you can use, but a lot of our vision is that we want to empower all the people who use our products to basically create agents for themselves. So whether that's the many, many millions of creators on the platform, or the hundreds of millions of small businesses, we eventually want to be able to pull in all your content, very quickly stand up a business agent, and have it interact with your customers: do sales, customer support, all of that. The one we're just starting to roll out more now is what we call AI Studio. It's basically a set of tools that will eventually make it so that every creator can build sort of an AI version of themselves, an agent or an assistant that their community can interact with. There's a fundamental issue here: there are just not enough hours in the day, right?

  • If you're a creator, you want to engage more with your community, but you're constrained on time. And similarly, your community wants to engage with you, but it's tough.

  • There's just limited time to do that. So the next best thing is allowing people to basically create these artifacts, right?

  • It's sort of an agent, but you train it on your material to represent you in the way that you want. I think it's a very creative endeavor, almost like a piece of art or content that you're putting out there. And it's very clear that it's not the creator themselves engaging, but I think it'll be another interesting way, just like how creators put out content on these social systems, to have agents that do that. One of the interesting use cases we're seeing is people using these agents for support. One thing that was a little bit surprising to me is that one of the top use cases for Meta AI already is people using it to role-play difficult social situations that they're going to be in, whether it's a professional situation.

  • It's like, all right, I want to ask my manager how I get a promotion or a raise, or I'm having this fight with my friend, or I'm having this difficult conversation with my girlfriend. How is this conversation going to go? It's basically a completely judgment-free zone where you can role-play that, see how the conversation might go, and get feedback on it. But a lot of people don't just want to interact with the same kind of agent that everyone else is using, whether it's Meta AI or ChatGPT or whatever. They want to create their own things.

  • So Llama is genuinely important. We built this concept we call an AI Foundry around it so that we can help everybody build. A lot of people have a desire to build AI, and it's very important for them to own the AI, because once they put that into their flywheel, their data flywheel, that's how their company's institutional knowledge is encoded and embedded into an AI. They can't afford to have that AI flywheel, that data flywheel, that experience flywheel, somewhere else. Open source allows them to do that, but they don't really know how to turn the whole thing into an AI. So we created this thing called an AI Foundry: we provide the tooling, we provide the expertise and the Llama technology, and we have the ability to help them turn the whole thing into an AI service. And when we're done with that, they take it, they own it. The output of it is what we call a NIM, an NVIDIA inference microservice. They just download it, take it, and run it anywhere they like, including on-prem. And we have a whole ecosystem of partners, from OEMs that can run the NIMs, to GSIs like Accenture that we've trained and work with to create Llama-based NIMs and pipelines. And now we're off helping enterprises all over the world do this.
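The "download it and run it anywhere" model Jensen describes means the enterprise talks to its NIM over a standard HTTP API. As a minimal sketch, assuming a Llama NIM serving locally (the host, port, and model name below are illustrative assumptions, not guaranteed values for any particular deployment), a client request could be assembled like this:

```python
# Sketch of calling a self-hosted Llama NIM through an
# OpenAI-style chat-completions endpoint. The endpoint URL and
# model name are illustrative assumptions for a local deployment.

def build_chat_request(prompt, model="meta/llama3-8b-instruct"):
    """Assemble the JSON body for POST /v1/chat/completions."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
        "temperature": 0.2,
    }

body = build_chat_request("Summarize our return policy.")
# In a real deployment you would POST this, e.g. with `requests`:
#   requests.post("http://localhost:8000/v1/chat/completions", json=body)
```

Because the request shape is the familiar chat-completions format, existing client tooling can usually point at the self-hosted endpoint with only a base-URL change, which is what makes "run it anywhere, including on-prem" practical.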

  • I mean, it's really quite an exciting thing It's really all triggered off of the llama open sourcing the the Ray-Ban metaglass Your vision for for bringing AI into the virtual world is really interesting tell us about that Yeah, well, okay a lot to unpack in there The segment anything model that you're talking about We're actually presenting I think the next version of that here at SIGGRAPH segment anything, too And it is it now works.

  • It's faster.

  • It works with Here we go It works in video now as well, I think these are actually cattle from my ranch in Kauai By the way, these are what they're called delicious delicious delicious marks There you go Yeah, another next time we do so mark mark came over to my house and we made Philly cheesesteak together next time You're bringing the time you did I was more of a sous-chef the fun effects Will be able to be made with this and because it'll be open a lot of more serious applications across the industry, too So, you know, I mean scientists use this stuff to you know study like coral reefs and natural habitats and in kind of evolution of landscapes and things like that, but I mean it's a being able to do this in video and having a be a zero shot and be able to kind of interact with it and tell it what you want to track is It's it's it's pretty cool research.
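The "tell it what you want to track" workflow Mark describes, prompting an object with a click on one frame and propagating the mask through the rest of the video, follows roughly this control flow. The predictor below is a toy stand-in so the sketch is self-contained; the real interface is Meta's SAM 2 library, whose actual API differs.

```python
# Control-flow sketch of zero-shot, promptable video segmentation:
# prompt an object with a point on one frame, then propagate the
# mask through the remaining frames. ToyVideoPredictor is a fake
# stand-in for illustration, not the real SAM 2 model.
class ToyVideoPredictor:
    def add_point_prompt(self, frame_idx, point):
        # A real model would segment the object under the click.
        self.tracked = {"frame": frame_idx, "point": point}
        return {"frame": frame_idx, "mask": "mask@%d" % frame_idx}

    def propagate(self, num_frames):
        # A real model would track the prompted mask frame by frame.
        for f in range(self.tracked["frame"] + 1, num_frames):
            yield {"frame": f, "mask": "mask@%d" % f}

predictor = ToyVideoPredictor()
first = predictor.add_point_prompt(frame_idx=0, point=(120, 80))  # user click
masks = [first] + list(predictor.propagate(num_frames=4))
print([m["frame"] for m in masks])
# → [0, 1, 2, 3]  (one mask per frame)
```

The key property being demonstrated in the talk is that no per-object training is needed: a single interactive prompt is enough to define the tracking target, which is why scientists can apply it directly to footage of coral reefs or landscapes.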

  • I think what you're going to end up with is a whole series of different potential glasses products at different price points, with different levels of technology in them. So, based on what we're seeing now with the Ray-Ban Metas, I would guess that display-less AI glasses at around a $300 price point are going to be a really big product that tens of millions of people, or eventually hundreds of millions of people, are going to have.

  • You're going to have super-interactive AI that you're talking to. Yeah, visual. You have the visual language understanding that you just showed. You have real-time translation.

  • You could talk to me in one language; I hear it in another language.

  • Yeah. Then the display is obviously going to be great too, but it's going to add a little bit of weight to the glasses, and it's going to make them more expensive. So there will be a lot of people who want the full holographic display, but there are also going to be a lot of people who want something that eventually is going to be really thin glasses.

  • So, you guys know, when Zuck calls, it's for H100s. His data center of H100s, I think you're coming up on 600,000.

  • And we're good customers. That's how you get the Jensen Q&A at SIGGRAPH.

  • Ladies and gentlemen, Mark Zuckerberg. Thank you.
