Making the most of Colab (TF Dev Summit '20)

[MUSIC PLAYING]

TIM NOVIKOFF: Hi. My name is Tim, and I'm the product manager for Colab. For those of you who don't already know, Colab lets you write and execute arbitrary Python code via the browser. It's Jupyter notebooks hosted by Google. It's integrated with your Google account, which means that if you're signed into Chrome or signed into Gmail, you're already signed into Colab. And it's integrated with Google Drive, which means that when you create a Jupyter notebook in Colab, it's automatically saved to your Drive account.

Colab started in 2012 as an internal tool for data analysts. In 2014, we started using it for our internal machine learning crash course. This is actually where a lot of Googlers first learned TensorFlow. In 2017, we launched it publicly. We wanted the whole world to have access to the same great tool that Googlers were using to get their work done.

Colab is easy to use. When you connect to a virtual machine in a Colab notebook, your Python environment is already set up for you. The latest version of CUDA is already installed for you. It just works. So how do we do this? Well, on the back end, we maintain a pool of pre-warmed VMs. These virtual machines have hundreds of popular Python libraries pre-installed, including TensorFlow, TensorBoard, TF Privacy, TF Datasets, and many more. There are resource limits in Colab, and those are there to ensure sustainability and reduce abuse.

Without any further ado, here are my top 10 Colab tricks for TensorFlow users.

Number 10: Always specify your TensorFlow version in Colab. In Colab, when you import TensorFlow, by default today it imports a version of TensorFlow 1. But soon, it will import a version of TensorFlow 2 by default. You can start using TensorFlow 2 in Colab today by specifying your TensorFlow version with the syntax shown on the screen (a sketch follows after this list of tricks). If you have old Colab notebooks that use TensorFlow 1 and you want those to keep working well after we switch to defaulting to TF 2, make sure to specify TF 1.x in those notebooks. That will future-proof those notebooks so they keep working well. The bottom line: always specify your TensorFlow version in Colab.

Number 9: Use TensorBoard. Colab output cells can render essentially arbitrary HTML, and the TensorBoard team has taken advantage of this to let you use TensorBoard right inline in Colab (sketch below). It works great, and I highly recommend it.

Number 8: TFLite? No problem. Even though TFLite is for on-device machine learning, you can still train your TFLite models in the cloud in Colab (sketch below). And in fact, that's a popular way to use TFLite.

Number 7: Use TPUs. They're free in Colab and very easy to use. You just go to "Change runtime type" and select TPU from the dropdown (sketch below). If you want to feel the raw power of TPUs under your fingertips, give them a whirl for free in Colab. And with TensorFlow 2.1, TPUs have never been easier to use.

Number 6: Use local runtimes if you want. If you have your own workstation, and perhaps your own GPUs, and you want to run your workload on your own hardware, that's fine. You can still use the Colab UI with a local runtime, and that's easy to do as well (sketch below).

Number 5: The Colab scratchpad notebook. If you find yourself with Colab notebooks that have names like "Untitled 17" and "Untitled 18," the Colab scratchpad notebook might be for you. This is a special notebook, available at the URL shown here, that's not automatically saved to your Drive account. And it's really great for doing your scratch work.
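For trick 10, here is a minimal sketch of the Colab-specific version-selection magic the talk refers to, assuming it runs in Colab (it is not available in plain Jupyter) and before TensorFlow is first imported:

    # Select TF 2 before the first import (Colab-only magic).
    %tensorflow_version 2.x
    import tensorflow as tf
    print(tf.__version__)  # prints a 2.x version

    # In old notebooks that should keep using TF 1, pin it instead:
    # %tensorflow_version 1.x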
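For trick 9, a minimal sketch of inline TensorBoard in a Colab cell, assuming your training job writes its logs to a ./logs directory (for example via a tf.keras.callbacks.TensorBoard callback):

    # Load the TensorBoard notebook extension, then render it inline.
    %load_ext tensorboard
    %tensorboard --logdir logs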
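For trick 8, a sketch of the train-in-the-cloud, deploy-on-device flow. The tiny model and random data are placeholders, but tf.lite.TFLiteConverter is the standard TF 2.x conversion API:

    import numpy as np
    import tensorflow as tf

    # Train a placeholder model on the Colab VM...
    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
    model.compile(optimizer="adam", loss="mse")
    model.fit(np.random.rand(128, 4), np.random.rand(128, 1), epochs=1, verbose=0)

    # ...then convert it to TFLite for on-device use.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    with open("model.tflite", "wb") as f:
        f.write(converter.convert())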
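For trick 7, a sketch of the TF 2.1-era TPU setup after you select TPU under "Change runtime type." It assumes the COLAB_TPU_ADDR environment variable that Colab's TPU runtime sets, and the model is a placeholder:

    import os
    import tensorflow as tf

    # Connect to the TPU that Colab attached to this runtime.
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
        tpu="grpc://" + os.environ["COLAB_TPU_ADDR"])
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.experimental.TPUStrategy(resolver)

    # Build and compile under the strategy scope so the model runs on the TPU.
    with strategy.scope():
        model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
        model.compile(optimizer="adam", loss="mse")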
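For trick 6, a sketch of the local-runtime connection as Colab's documentation described it around this time. These are terminal commands on your own machine, not notebook cells, and the details may have changed since:

    # On your local machine: install and enable the bridge extension.
    pip install jupyter_http_over_ws
    jupyter serverextension enable --py jupyter_http_over_ws

    # Start Jupyter so colab.research.google.com is allowed to connect, then
    # paste the printed URL into Colab via Connect > "Connect to local runtime".
    jupyter notebook \
      --NotebookApp.allow_origin='https://colab.research.google.com' \
      --port=8888 \
      --NotebookApp.port_retries=0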
Number 4: Copy your data into your Colab VM. Rather than reading data from external storage at runtime when you're, for example, training your models, first copy all your data into the Colab VM (sketch below). This can result in speed-ups even if you're only going to use the data once.

Number 3: Mind your memory. If you've ever run out of memory in Colab, you may have noticed that later you were assigned a special high-RAM runtime. That may seem like a great thing, but it also means that you're more likely to run into Colab's resource limits later. So the best thing to do is to not run out of memory at all, and mind your memory while you work (sketch below).

Number 2: Close your tabs when you're done with Colab. This will help you disconnect from the underlying virtual machines sooner. Again, conserving resources makes it less likely that you'll run into Colab's resource limits.

Finally, the number one trick: only use GPUs when they're actually needed for your work. Of course, when using TensorFlow, GPUs often are needed. But when you're doing work that doesn't require a GPU, just use a default runtime that only has a CPU (sketch below). Again, conserving resources helps you get the most out of Colab by avoiding running into its resource limits later.

Now, we do hear from users all the time things like, "We want faster GPUs. We want more reliable access to P100s. We want longer runtimes. We want more lenient idle timeout periods. We want longer maximum VM lifetimes. Or we want higher memory, with more reliable access to high-RAM runtimes." Well, we do hear you. And that's why we recently launched Colab Pro. Colab Pro comes with faster GPUs, longer runtimes, and more memory for $10 a month. We launched it in February, and it's been a big success. It's only available in the United States for now, but we're working as fast as we can to bring it to more countries.

All right. So what's next in Colab? Well, first of all, the free version of Colab is not going away. The free version of Colab and Colab Pro will continue to coexist, and we're continuing to work on improving both of them. As for feature development, that's where I'd like to hear from you. Tweet at me. Send feedback via the product. Tweet at @GoogleColab. However you do it, let us know what you want us to build next in Colab, because it's user feedback from people like you that drives our roadmap going forward. So please let us know what you want next in Colab.

[MUSIC PLAYING]
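For trick 4 above, one common pattern, sketched with a hypothetical Drive path: mount Drive, copy the data to the VM's local disk once, and train against the local copy:

    from google.colab import drive

    # Mount Google Drive (the external storage in this example).
    drive.mount("/content/drive")

    # Copy the dataset onto the VM's local disk once, up front.
    # "My Drive/datasets/cats" is a hypothetical path.
    !cp -r "/content/drive/My Drive/datasets/cats" /content/cats

    # Point your input pipeline at /content/cats, the fast local copy.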
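For trick 3, a quick way to keep an eye on RAM from inside a notebook; psutil comes pre-installed on Colab VMs:

    import psutil

    # Report how much of the VM's RAM is currently in use.
    mem = psutil.virtual_memory()
    print(f"{mem.used / 1e9:.1f} GB used of {mem.total / 1e9:.1f} GB ({mem.percent}%)")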
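And for trick 1, a quick check that tells you whether the current runtime actually has a GPU attached before you reach for a GPU runtime:

    import tensorflow as tf

    # An empty list means this is a CPU-only runtime.
    print("GPUs:", tf.config.experimental.list_physical_devices("GPU"))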