[MUSIC PLAYING]

[VIDEO PLAYBACK]

- I'll tell people, I work at Google. And they go, what do you work on? I design Search. And they kind of pause for a second and they're like, what is there to design?

[MUSIC PLAYING]

- Oh, I know. It's a cat.

[BEEP]

- Hi. How can I help?

[MUSIC PLAYING]

[END PLAYBACK]

BRENDA FOGG: Hello, everybody. I'm Brenda Fogg. I work in the Creative Lab, which is sort of a multidisciplinary creative group within Google that often collaborates with other teams across the company on product or technology experiments, and sometimes finds new ways to talk about some of the innovative work that's going on inside of Google. And at some point over the years, I've worked with each of our panelists on some of those projects that you probably saw in the video just now. So we're going to talk a little bit about some things that you may have seen. A little bit about design at Google, and what that means as sort of a fundamental framework and connective tissue between the things that Google makes, and making them as useful and accessible to as many people as possible.

So let's start with some introductions. We have Doug Eck right here to my left, who leads a project called Magenta, which is all about exploring machine learning and creativity. We have Isabelle Olsson, who's responsible for the design of the Google Home and wearables and hardware. And over there is Ryan Germick, Principal Designer, who is also known as the Doodle guy, has been quite involved in the Google Assistant, and generally is just in the business of delighting users everywhere.

So let's start by letting everybody talk a little bit about what you do at Google. And we'll start with Doug, because you're sitting next to me. You head up this project called Magenta. For anybody who doesn't know exactly what that is, maybe you can talk a little bit about that. And maybe touch on what inspired the group and the focus of the team.

DOUGLAS ECK: OK, sure. So yeah, I lead a project called Magenta. The M-A-G in Magenta stands for Music and Art Generation. And we started by trying to understand the capacity to use AI, specifically machine learning, to generate music and art. And it took us about a month to realize that that's asking the wrong question. Because if all you're doing is trying to generate music and art, then you just keep pushing this button and the machine learning keeps making music and art for you. And it gets boring fast. So we pivoted very quickly to talking about, how can we use machine learning to enable artists and musicians to make something new? To make something different? So it's sitting on the same idea of technology and art interacting with one another in a virtuous way, starting with cave drawings and moving forward through the film camera and other bits of technology. So yeah, I'm about AI and art and music.

BRENDA FOGG: So you're not necessarily trying to replace creativity or duplicate creativity, but more providing the tools to enable people to do that?

DOUGLAS ECK: Yeah. And I think it's not just because that's what we choose to focus on. I think creativity is fundamentally human. And it's about communication. So if we take the communication loop out, we can imagine, in some analytical way, a computer generating new things. But what makes creativity work is how we respond to it and how we then feed back into that process. So I think it's very much a societal, communicative act.
BRENDA FOGG: And that idea of creating new things, like maybe some things that weren't necessarily possible before. That weren't humanly possible to create. So there's an-- I don't know if you want to talk about this, but this example that was shown in the lead-up to the keynote yesterday. The NSynth project, which is one of those things that can sort of augment what human creativity can do. You want to touch on that?

DOUGLAS ECK: Yeah, sure. So NSynth, and the follow-on hardware device, NSynth Super, which you may have seen played on stage before the keynote and discussed there. I think the main idea there is, can we use machine learning to allow us to generate new sounds? And sounds that are musically meaningful to us? And one thing to point out is that we already have ways to do that. There's a bunch of great software. I have a piano at my house, so I can-- and a guitar. There are lots of ways to make sounds. What we hope we can get is some kind of expressive edge with AI. Something that we can do with these models. A kind of intuitiveness, or a new kind of artistic mobility, by having a new tool. And the one thing I would say-- I don't want to take up too much time, because there are a lot of other great people here on stage-- but I really like to think about the film camera. The film camera was initially not treated as an artistic device. It was treated as something to capture reality. And it was transformed into an artistic device by artists, by photographers. And our hope on Magenta is that we find the right artists and the right musicians to take what we're doing and turn it into something creative.

BRENDA FOGG: So, turning something into something creative. Isabelle, when you're designing a product, the Home for example, you're trying to create something that appeals to everyone through its design. But everybody's different, right? So these are physical products that share a physical space with the people that use them. And sometimes, you have to cohabitate. So talk a little bit about how you approach that problem.

ISABELLE OLSSON: Yeah. I mean, I have the utmost respect for people's homes. And like you said, they're all different. And I think next to your body, your home is your most intimate space. And it's the place you share with your loved ones and your family. So to enter that space with our products, we have to be super thoughtful about what we do there. And I think for us, the most important thing is to be inspired by the context in which our products live. So when we were designing Google Home Mini, for example, the goal was to design an Assistant for every room. And that means your bedside table. And your bedside table, that's where you put devices that help you see better. Or like, a book that helps you dream. So that space is just so special. We wanted to create something that was beautiful, that fit into the home, and didn't take up too much attention, and kind of faded a little bit into the background.

BRENDA FOGG: And you're also responsible for CMF at Google, which is Color, Material, Finish, right?

ISABELLE OLSSON: Yes.

BRENDA FOGG: And I've heard this story about testing, like, 150 different versions of color palettes for the Mini. Is that right?

ISABELLE OLSSON: Yeah. I mean, I think for us, developing the color palette for the Google family of products and individual products is a very-- it's a combination of art and science, I would say. And we start usually two to three years before the products come out.
So we have to do a lot of anticipation of where society is going, where trends are going, and take all of those kinds of inputs into account to make sure that when we release a product, it makes sense to people. In addition to that, of course, when you design for the home, you have to think about the fact that there is going to be light hitting the product. How does it stand the test of time? We want to make sure the products look beautiful for a long time. So we have to go through a lot of iteration to get it right. And then also, especially as we're developing fabrics, for example, depending on where you put it, it looks different in different lighting conditions. So when we designed Mini, we went through, I think, 150 iterations of just the gray color. But it was a lot of fun. And it was about finding that right balance of, what is too light? What's too dark? And the other day, I got this lovely email from someone on the team who had picked out his couch to match Google Home Max. So I took that as a giant compliment, because we were trying to do it the other way around. But that was a beautiful story.

BRENDA FOGG: What is the intersection of the intuition that you use as a designer when you approach these kinds of problems with the sort of iterative testing? And sort of the scientific materials examination?

ISABELLE OLSSON: Yeah, it's a hodgepodge. The process is not linear. It's pretty messy, usually. But we have fun with it. I think the key is to gather as much input as possible, and then digest it. And then come up with prototypes and ways of relating to how this will fit into people's homes. So right next to my desk, I have a big bookshelf where we place random objects from all over the world for inspiration. But also to kind of put our stuff there quickly to see, how does it feel? And how does it feel over time? Because it's not only about creating something that you are first attracted to; it has to be something that you can live with for a long time.

BRENDA FOGG: So Ryan, you lead the Google Doodles team. And this team is unique in a lot of ways. One of them is that you regularly and willfully break the brand rules.

RYAN GERMICK: Gleefully, yeah.

BRENDA FOGG: Gleefully, for Google. In fact, on a daily basis, many, many times. And that's unusual, because it's the core of the brand. And that's something that seems to keep working and working and working over the years. So talk a little bit about why you think it's important to have the ability to kind of just mess with it.

RYAN GERMICK: Sure. I mean, Google's mission is to organize the world's information and make it universally accessible and useful. And I believe in that mission. I think it's a very powerful, good thing to do for the world. And we home in on the idea of making things accessible by creating an emotional connection with users. And sometimes, like, mucking up the standards-- that's like collateral damage. If people get a positive charge and learn something new or have fun with something, then we think it's worthwhile. And yeah, I think on a human level, there are things that are more important than consistency. For us, it's more about using our creativity and craft to make people feel welcome in the space of technology.

BRENDA FOGG: Yeah. So making people feel welcome in the space of technology. You also lead the team who created the personality for the Google Assistant.

RYAN GERMICK: Right.

BRENDA FOGG: And how do you create a personality?
I mean, there are sort of the transactional things that have to happen with a user when they're interacting with a digital assistant. And they have an expectation that they're going to be delivered the information that they asked for. And you felt like it needed to go a little bit farther than that sort of transactional relationship. But people have-- a little bit the way we were talking with Isabelle, everyone has different things that they like to interact with. And some people like small talk and some people don't. And some people think things are funny that other people think are totally not funny at all. Talk a little about that.

RYAN GERMICK: Yeah. I mean, I think, as Isabelle mentioned, technology like the Assistant, which comes in a smart speaker, or a smart display, or in your phone, is really personal. That's one thing. And so we recognize that we have a different set of design challenges than if it was more objective, like a Google Search engine [INAUDIBLE]. And then also, when you invite this technology into your life-- we're using this conversational interface as a metaphor. You can talk to it and it can respond to you. And as soon as you hear the human voice, it not only opens up an opportunity to have a character, but, from what we've seen, it's almost an obligation to design for the character. Because if you don't design for it, people will just assume you don't have much of a character. But there's still some implicit character. So we took the learnings that we had from Doodles, being an implicit character for Google, where we celebrate certain things and we get creative and nerdy and excited, and we tried to transfer that over to the Google Assistant, where it could be a character that you'd want to spend time with because it has things that it gets excited about. Or it has a perspective where it really wants to help you. And not just be something that you want to use, but something that you want to spend time with. So yeah, a surprising number of the principles from things that we did for Doodles were applicable to the Google Assistant. And it's a huge project. And there are a lot of pieces to the puzzle, but we think it's an important part of the user experience to have a sense of who the character is.

BRENDA FOGG: Yeah. So each of you has talked a little bit about how technology interacts with humans and vice versa. And how those two things have to kind of coexist. So good design and thoughtful design is a means to make technology, in this case, more approachable and useful and usable. And friendly. And to make people comfortable with that. And you all approach your work and problem-solving from a very human perspective, right? A very like-- you inject empathy. We're going to get real and talk about humanity and empathy, right? Injecting this empathy into your process. So let's talk about that. Doug, do you think-- can the work you do with machine learning allow a machine to express art in a human way? Let's start there.

DOUGLAS ECK: Start there. Yes, with some constraints on how this all works. I think what we realized early was that we need at least two players in this game, so to speak. Part of the work is building new technology. So in some sense, we're taking on the role that a luthier might take on in building a guitar. Or that someone doing a music tech program might take on in building a new kind of electronic instrument. And I think there's a thought process that goes with building something like that that is very creative.
But I think you're also, in some very real way, constrained by the act of building the thing. To understand it in a certain way. It's your baby. You built it. And so you know-- you wrote the operating manual, so you know what this thing is supposed to do. And in most cases, what we see is that for something to become a truly expressive, artistic device, it has to, in some very real way, be broken by someone else. And I think it's almost impossible for us as the builders of that device to also be the ones that break it. And so our dream in Magenta is to connect with artists and musicians and people. People that don't know how to code. People that don't necessarily even care much about computation. And draw them into this conversation. And what we found is that we started by releasing machine learning models in open source on GitHub, as part of TensorFlow, with instructions like, please run this 15-line-long Python command. It's going to be great. Just run this command and hit Enter. And then just wait, because you're going to get 100 MIDI files in a temp directory somewhere on your machine, right? And everybody's like, that's not how I make music. So what we've seen is that part of our work, even on the technologist side, even as luthiers, so to speak-- guitar makers-- part of our job is to do good design. And to build interfaces that people can use. And then, hopefully, the interfaces are flexible enough and expressive enough that, in some very meaningful way, people can also do some fun breakage. And getting there requires a lot of moving parts. A large component of which is very good design.

BRENDA FOGG: I like that notion of breaking things. You told a story once about-- or you made an analogy once about the electric guitar, I think, and how that's a little bit similar. Like the dissonance that people create with electric guitars is not--

DOUGLAS ECK: Yeah. That's right. So first, I tell the same stories. I'm like the grandpa--

BRENDA FOGG: Well, I don't know if you tell it to everybody.

DOUGLAS ECK: Please, tell us that story again. No, but it's true. The electric guitar was invented to be a loud acoustic guitar. To overcome noise on stage. And they were trying really hard to not have these amplifiers distort. So imagine a world where amplifiers don't distort and electric guitars sound like acoustic guitars. You actually haven't moved very far. And the breakage there was actually having fun with the distortion. And actually going for sounds that aren't like an acoustic guitar.

BRENDA FOGG: Let's go back to Isabelle. One of the things I think that's so interesting about your work, given that people have to cohabitate with these physical things, is that it's just as important, or maybe more important, how people feel about these things rather than just what their utility is. What kind of considerations do you make for-- we're starting to sound like hippies now. But like, people's feelings and their empathies and the way they coexist in the space with these things?

ISABELLE OLSSON: I think a good tool that I use a lot is that I put stuff in front of people and ask them, what do you think it looks like? It's a fun game. You don't always get back what you want to hear. But it's a really good way of testing whether the object you've created has positive connotations or negative connotations. The first time I showed a prototype of Mini, I showed it to a French person who said, it's like a macaron. And I thought that was amazing because, first of all, I love macarons.
And then, I think having something connote something sweet and delicious is just excellent. And again, we surround ourselves with food. That was just-- I knew we were onto something there. And food is something universally appealing, generally. So that's one exercise out of many. I think the key is just to really make the thing real really quickly. To translate the big idea into something tangible, and then live with it ourselves for a while, too. And then also, think about not only the food analogies, but also making sure that the objects we design are understandable. You understand what it is. So again with Mini, we wanted it to look a little bit like a speaker and a little bit like a microphone, but not too much of either. But be very honest to that function. And then connote that this goes in the home. And therefore, the fabrics and the things that we use to surround ourselves with.

BRENDA FOGG: Yeah. It has to have that human touch to it as part of the design process.

ISABELLE OLSSON: And the beauty of it is that when you find these solutions, a lot of the time they enhance the function or help with the function. Like, fabric is this excellent material that is, most of the time, audio-transparent. You can have lights shine through it. You can kind of create this calmness in the object itself by getting all the functionality through it. And I'm really passionate about trying to design pieces of technology that can live out in the open-- that hopefully people think of as just stuff, not as technology. There are just way too many pieces of furniture that are purely designed to hide technology. So my goal in life is to get rid of those things.

BRENDA FOGG: And Ryan, that sort of human touch is pretty evident in most everything that you do. So if we can talk about the Google Assistant again. It was designed to operate and to be used through the power of conversation, which is a fundamental human interface, I guess. And through the course of your work on creating a personality-- talk a little bit about how you sort of steered through the landmines of what kinds of-- aside from the transactional things, what kinds of things are people going to want to talk about with their Assistant?

RYAN GERMICK: Yeah. I mean, I think this may be a bit cliche, but it's still early days. So I think we're still steering. But for us, the guiding principle for success is a feeling: does this feel like a character you want to spend time with, like I mentioned earlier? As far as finding things that we wanted to steer clear of-- I mean, it was really interesting to look at the different queries that people ask Google Search, and then what people ask the Google Assistant. And at Google, as you might imagine, there are a lot of people that have a background in, like, information retrieval, data ranking, things like search ranking, things like that. And it kind of turned things on their head when people started asking questions like, what's your favorite flavor of ice cream? Or like, did you fart? And those are, like, pretty-- more common than you'd think. When people first get a piece of technology that's been lovingly crafted, they all of a sudden have a very different relationship to it. A very sizable number of the queries that we get on the Google Assistant are, like, first-date queries. Like, do you have any brothers or sisters? It's really sweet.

DOUGLAS ECK: Ryan, you're the only person that can-- what is her favorite flavor of ice cream? I'm sure everybody wants to know.
RYAN GERMICK: This is a very illuminating question, Doug. Thank you for asking. So basically, we have a principle-- and this speaks to Brenda's question, too. We set up principles where, for example, we want to talk like a human. We want to take advantage of the human voice and the interface. But we don't want to pretend to be one. So if you were to ask a question like, what's your favorite flavor of ice cream, we would do what we call an artful dodge. We look to our training in improv theater, where we don't want to deny the user with something like, I do not eat ice cream. I do not have a mouth. That's, like, a really bummer answer. If you're exploring a new technology, that's a shutdown to the conversation. But at the same time, we don't want to lie and say, well, salted caramel, obviously. That's a position that's disingenuous, because it does not eat ice cream. So we would say something like, you can't go wrong with Neapolitan. There's something in it for everyone. We would take that question, understand that the subtext is, I'm getting to know what you are and what your capabilities are, and we would-- yes, and. We would kind of continue to play the game. And we'd use it, actually, as an opportunity to make a value statement that we're inclusive. We want to reflect that ice cream that's good for everyone is good.

BRENDA FOGG: How much dialog goes on within your team when you're trying to-- when you're talking about, OK, what if someone asks the Google Assistant, do you fart?

RYAN GERMICK: Yeah. As soon as I knew that question was going to get a crafted answer, and wasn't just going to default to a stock one, I knew that we'd already won. The humanists amongst us had already won. Because there was a school of thought that you would say, I don't fart. I don't have a body. And that was, like, end of story. And that's true, but kind of not in line with keeping the game going. So we would have a lot of back and forth. And we would take that answer and say, well, at least you could say, I don't have a butt. Because at least then you'd be a little more specific.

BRENDA FOGG: Start there. No butt.

RYAN GERMICK: But in our case, we ended up with something a little more playful, and a little more addressing of the subtext-- of the school of, whoever smelt it, dealt it. So we said, you can blame me if you want. I don't mind. If the user's asking about that, let's just take it one step further and put them on the spot.

BRENDA FOGG: Are you all going to go ask your Google Assistant now?

RYAN GERMICK: I think there are, like, 25 different answers, because that is definitely a key use case for--

BRENDA FOGG: Keep asking. Just keep asking. OK. So let's talk a little bit about how all of this humanity plays out in the context of a brand, like Google. So Isabelle, the Home Mini, you mentioned, needed to be both a speaker and a microphone, as well as an assistant, and behave like an assistant. So if you're starting from those very engineering kinds of product requirements, how do you go from there to the idea of the personality of a brand? In Ryan's case, his work talks. The personality comes through that way. In your work, it comes through sort of the materials and the things. How do you consider the personality of the Google brand in the work that you do?

ISABELLE OLSSON: Yeah. I mean, I think it's a huge responsibility.
And we're only a few years into making hardware that people actually put down money for. And you know, the brand is just really incredible. So we're trying to figure out, what's core to Google? And how do we translate that into physical form? And sometimes, it's not about a direct translation. Because most people don't want to pay money for something quirky, maybe. So it's about taking that kind of principle and that idea, and then thinking about what it means for hardware. So in this case, for example-- to me, Google stands for a sense of optimism. Kind of this optimistic outlook on the future. So if I can do things that remind people of that, or that make people smile, I think that naturally feels like a Google product. So just one simple example of that: if you turn Mini upside down, there is a pop of color on the back. And only you, as the person who bought the product, know that. But you know it kind of has that Google on the inside.

BRENDA FOGG: Yeah. Let's go back to Ryan then. Because over the years-- over seven or eight years, or whatever. However many years you've been--

DOUGLAS ECK: 12, almost.

BRENDA FOGG: 12 years. You've had a lot of opportunities to craft those sorts of moments of delight, those sorts of little user experiences that are like turning over the Mini and finding a little surprise. So everything from-- you're responsible for the Pegman, which is the character that you drop into Google Maps when you go into Street View. And we talked about the personality of the Google Assistant a little bit. And then of course, the Doodles taking over the home page. So over the 12 years that you've been kind of working in that territory, as the Google brand has grown and evolved, how has that growth of the brand impacted the work that you do?

RYAN GERMICK: I think the core of what I try to do, I almost discovered by accident. Like the Street View Pegman-- maybe that's a story for another day. But I was just glad that I worked in a place that had free strawberries when I got here. That was very exciting to me. And then, that they paid me to draw and be creative was just beyond my wildest dreams. So I'm just, like, happy to be here. Still, I'm happy to be here. But what kind of worked for me, because it was always sort of my MO, was, how can I use my position of privilege to bring other people up and to give them a sense of belonging? And that has stayed consistent. So whether it's trying to make sure we have inclusive Doodles, or creating an opportunity for a little, like, mannequin that can be dressed up for holidays or whatever, for Street View-- there's been a through line. Where maybe in the beginning, Google was more of an underdog. And now, Google is a very important part of people's lives. I don't think you could really say it's a small organization, by any stretch. But there are still human touch points that matter, to make people feel like they belong, which is what Google's trying to do for everyone.

BRENDA FOGG: I want to make sure we leave time for questions, if anybody has them. So if you have questions, you could start coming to the microphones while we kind of go a little bit into the future. Let's talk about the future. So if we're sitting here a year from now, or a few years from now-- Doug, what do you expect that machine learning might do for art in the future? Whether it's your aspirations for the next 12 or 18 months, or maybe five years from now.
DOUGLAS ECK: So I think the really interesting way to think about this is to consider generative models: a family of machine learning models that generate new instances of the data upon which they're trained. Where I see us going is actually very heavily integrating the design process of products with generative models. We're seeing machine learning generate part of what we're trying to do. And I think that's going to touch our lives in the arts and music and communication in a number of ways. And to those of you who are-- anybody in the room a developer? It's an easy question, because we're at a developer conference, right? So we're going to have a responsibility as machine learning experts to understand a little bit about design. A responsibility as back-end engineers to understand a little bit about machine learning and design. I think we're going to see much more of a need for end-to-end integration. For me, the future has already started happening, in a sense. I have teenage kids, and I've watched just how they use Snapchat to communicate. And how they've built their own kind of grammar around it. And it's a very, very simple product. Now, imagine 10 years of advances in assistive writing. So you know, you're using Google Docs and you're writing. And you have some machine learning algorithm helping you communicate, right? We're going to get very good at this very fast. When my kids were younger, the teachers were all worried if they used Wikipedia too much to write their papers. And now it's going to be like, wait, how much of this did you actually write? What part of it did you write? And what part of it did your Assistant write? And I think we can be dystopian about that. There are potentially some very difficult issues here. But it's also wonderful, I think. As long as we use this to communicate more effectively and in different ways, and we make it into something creative, I think it's very exciting to think about how machine learning can become more deeply integrated in the process of communicating. And again, that's what I see the arts as being about, and music being about. It's about communicating. It's about sharing our thoughts, our feelings, our beliefs with each other. And I'm seeing in my career that happening more deeply with machine learning as well. So that's my future vision.

BRENDA FOGG: I love it. I love your vision. Isabelle, what about hardware? What do you want to see in hardware in the next year or two?

ISABELLE OLSSON: Well, number one, I hope people find out about it. We just did a small exhibition in Milan a couple of weeks ago. And part of the exhibition was the portfolio that we launched last year. A lot of people would come up to me and say, these concepts are great. And I'm like, they're not concepts. They're actual products. So I think it's a little bit of that. And then, I hope we just continue to design for everyone in everyday life.

BRENDA FOGG: And Ryan, what would you like to-- what would you say? What would you like people to take away today?

RYAN GERMICK: I think just remember that technology is for people, first and foremost. So always keep that question in the back of your mind: how is what I'm doing helping people?

BRENDA FOGG: OK. Do we have questions?

[MUSIC PLAYING]
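For readers curious about the Magenta workflow Doug describes-- run a short Python command, get MIDI files in a temp directory-- the sketch below shows roughly what that looked like with Magenta's pretrained Melody RNN. It is a minimal example, not from the talk itself, assuming a 2018-era `magenta` pip install and a pretrained `basic_rnn.mag` bundle downloaded from the Magenta project; the primer notes, paths, and parameters are illustrative.

```python
# Minimal sketch of the "run a Python command, get MIDI" workflow described
# in the talk, using Magenta's pretrained Melody RNN (2018-era API).
# Assumes: `pip install magenta` and a pretrained bundle `basic_rnn.mag`
# downloaded from the Magenta project. Paths and parameters are illustrative.
from magenta.models.melody_rnn import melody_rnn_sequence_generator
from magenta.models.shared import sequence_generator_bundle
from magenta.protobuf import generator_pb2, music_pb2
import magenta.music as mm

# Load the pretrained model bundle and build a melody generator from it.
bundle = sequence_generator_bundle.read_bundle_file('basic_rnn.mag')
generator = melody_rnn_sequence_generator.get_generator_map()['basic_rnn'](
    checkpoint=None, bundle=bundle)
generator.initialize()

# Prime the model with a single note (middle C) so it has something to continue.
primer = music_pb2.NoteSequence()
primer.notes.add(pitch=60, velocity=80, start_time=0.0, end_time=0.5)
primer.total_time = 0.5
primer.tempos.add(qpm=120)

# Ask the model to continue the primer out to 30 seconds of music;
# temperature controls how adventurous the sampling is.
options = generator_pb2.GeneratorOptions()
options.args['temperature'].float_value = 1.0
options.generate_sections.add(start_time=primer.total_time + 0.1, end_time=30.0)

# Generate a NoteSequence and write it out as a MIDI file -- run in a loop,
# this is the step that dumps "100 MIDI files in a temp directory".
sequence = generator.generate(primer, options)
mm.sequence_proto_to_midi_file(sequence, '/tmp/magenta_melody.mid')
```

Looped over different temperatures and primers, this is exactly the push-the-button experience Doug says musicians found alienating-- the motivation for building playable interfaces like NSynth Super instead.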
Design, machine learning, and creativity (Google I/O '18)