
  • Mark, welcome to your first SIGGRAPH!

  • Hi!

  • How do you see the advances of generative AI at Meta today, and how do you apply it to either enhance your operations or introduce new capabilities that you're offering?

  • With generative AI, I think we're going to quickly move into the zone where not only is the majority of the content that you see today on Instagram just recommended to you from stuff that's out there in the world that matches your interests, regardless of whether you follow the people who posted it,

  • I think in the future a lot of this stuff is going to be created with these tools too.

  • Some of that is going to be creators using the tools to create new content, some of it I think eventually is going to be content that's either created on the fly for you or kind of pulled together and synthesized through different things that are out there.

  • I kind of dream of one day where you can almost imagine all of Facebook or Instagram being a single AI model that has unified all these different content types and systems together, which actually have different objectives over different time frames.

  • Some of it is just showing you what's the interesting content that you want to see today, but some of it is helping you build out your network over the long term, people you may know or accounts you might want to follow.

  • These multimodal models tend to be much better at recognizing patterns, weak signals and such.

  • It's so interesting that AI has been so deep in your company.

  • You've been building GPU infrastructure, running these large recommender systems for a long time.

  • I'm a little slow on it actually, getting to GPUs.

  • I was trying to be nice.

  • I know.

  • Tell everybody about the Creator AI and AI Studio that's going to enable you to do that.

  • We've talked about it a bit, but we're rolling it out a lot wider today.

  • A lot of our vision is that I don't think there's just going to be one AI model.

  • Some of the other companies in the industry are building one central agent.

  • We'll have the Meta AI assistant that you can use, but a lot of our vision is that we want to empower all the people who use our products to basically create agents for themselves.

  • Whether that's all the many, many millions of creators that are on the platform or hundreds of millions of small businesses, we eventually want to be able to just pull in all your content and very quickly stand up a business agent that can interact with your customers and do sales and customer support and all that.

  • The one that we're just starting to roll out more now, we call it AI Studio.

  • It basically is a set of tools that eventually is going to make it so that every creator can build an AI version of themselves as an agent or an assistant that their community can interact with.

  • There's a fundamental issue here where there's just not enough hours in the day.

  • If you're a creator, you want to engage more with your community, but you're constrained on time.

  • Similarly, your community wants to engage with you, but it's tough.

  • There's limited time to do that.

  • The next best thing is allowing people to basically create these artifacts.

  • It's an agent, but you train it on your material to represent you in the way that you want.

  • I think it's a very creative endeavor, almost like a piece of art or content that you're putting out there.

  • It's going to be very clear that it's not engaging with the creator themselves, but I think it'll be another interesting way, just like how creators put out content on these social systems to be able to have agents that do that.

  • One of the interesting use cases that we're seeing is people using these agents for support.

  • This was one thing that was a little bit surprising to me.

  • One of the top use cases for Meta AI already is people basically using it to role-play difficult social situations.

  • Whether it's a professional situation, it's like, I want to ask my manager how do I get a promotion or raise, or I'm having this fight with my friend, or I'm having this difficult situation with my girlfriend.

  • How can this conversation go?

  • Basically having a completely judgment-free zone where you can role-play that and see how the conversation will go and get feedback on it.

  • A lot of people, they don't just want to interact with the same agent, whether it's Meta AI or ChatGPT or whatever it is that everyone else is using.

  • They want to create their own thing.

  • Llama is genuinely important.

  • We built this concept we call an AI foundry around it so that we can help everybody build.

  • A lot of people, they have a desire to build AI.

  • It's very important for them to own the AI because once they put that into their flywheel, their data flywheel, that's how their company's institutional knowledge is encoded and embedded into an AI.

  • They can't afford to have that AI flywheel, that data flywheel, that experience flywheel somewhere else.

  • Open source allows them to do that, but they don't really know how to turn this whole thing into an AI.

  • We created this thing called an AI foundry.

  • We provide the tooling. We provide the expertise.

  • With Llama technology, we have the ability to help them turn this whole thing into an AI service.

  • Then when we're done with that, they take it, they own it.

  • The output of it is what we call a NIM.

  • This NIM, this NVIDIA Inference Microservice, they just download it, they take it, and they run it anywhere they like, including on-prem.

  • We have a whole ecosystem of partners, from OEMs that can run the NIMs to global system integrators (GSIs) like Accenture, that we've trained and worked with to create Llama-based NIMs and pipelines.

  • Now we're off helping enterprises all over the world do this.

  • It's really quite an exciting thing.

  • It's really all triggered off of the Llama open sourcing. And then the Ray-Ban Meta glasses, for bringing AI into the virtual world.

  • It's really interesting. Tell us about that.

  • Yeah, well, okay, a lot to unpack in there.

  • The Segment Anything model that you're talking about, we're actually presenting, I think, the next version of that here at SIGGRAPH, Segment Anything 2.

  • It now works. It's faster. It works with...

  • Oh, here we go.

  • It works in video now as well.

  • I think these are actually cattle from my ranch in Kauai.

  • By the way, these are...

  • They're called Mark's cows.

  • Delicious Mark's cows.

  • There you go.

  • Next time we do...

  • Mark came over to my house and we made Philly cheesesteak together.

  • Next time you're bringing the cow.

  • I'd say you did. I was more of a sous-chef.

  • Fun effects will be able to be made with this, and because it'll be open, there will be a lot of more serious applications across the industry too.

  • Scientists use this stuff to study coral reefs and natural habitats and evolution of landscapes and things like that.

  • Being able to do this in video, have it be zero-shot, and be able to interact with it and tell it what you want to track, it's pretty cool research.

  • I think what you're going to end up with is just a whole series of different potential glasses products at different price points with different levels of technology in them.

  • I kind of think based on what we're seeing now with the Ray-Ban Metas,

  • I would guess that display-less AI glasses at a $300 price point are going to be a really big product that tens or hundreds of millions of people are eventually going to have.

  • You're going to have super interactive AI that you're talking to.

  • You have visual language understanding that you just showed.

  • You have real-time translation.

  • You could talk to me in one language,

  • I hear it in another language.

  • The display is obviously going to be great too, but it's going to add a little bit of weight to the glasses and it's going to make them more expensive.

  • There will be a lot of people who want the full holographic display, but there are also going to be a lot of people who want something that's eventually going to be really thin glasses.

  • So you guys know, when Zuck calls, it's about his data center of H100s.

  • I think you're coming up on 600,000.

  • We're good customers.

  • That's how you get the Jensen Q&A at SIGGRAPH.

  • Ladies and gentlemen, Mark Zuckerberg.

  • Thank you.
