Where did quantum theory come from? It started, not as a crazy idea, but with a light bulb. In the early 1890s, the German Bureau of Standards asked Max Planck how to make light bulbs more efficient so that they would give out the maximum light for the least electrical power. The first task Planck faced was to predict how much light a hot filament gives off. He knew that light consists of electromagnetic waves, with different colors of light carried by waves of different frequencies. The problem was to ensure that as much light as possible was given off by visible waves rather than ultraviolet or infrared. He tried to work out how much light of each color a hot object emits, but his predictions based on electromagnetic theory kept disagreeing with experiments. Instead, in what he later called an “act of despair,” he threw the existing theory out the window and worked backwards from experimental measurements. The data pointed him to a new rule of physics: light waves carry energy only in packets, with high frequency light consisting of large packets of energy and low frequency light consisting of small packets of energy.

The idea that light comes in packets, or "quanta", may sound crazy, and it was at the time, but Einstein soon related it to a much more familiar problem: sharing. If you want to make a kid happy... give them a cookie! But if there are two kids, and you only have one cookie, you'll only be able to cheer them up half as much. And if there are four, or eight, or sixteen hundred thousand, you're not going to make them very happy at all if they have to share one cookie between them. In fact, if you have a room with infinitely many kids but not infinitely many cookies, and you share the cookies evenly, each kid will only get an infinitesimally small crumb, and none of them will be cheered up. And they'll still eat all your cookies.

The difference between light waves and kids is that you can't actually have infinitely many kids in a room. But because light waves come in all sizes, you can have arbitrarily small light waves, so you can fit infinitely many into a room. And then the light waves would consume all your cookies… I mean, energy. In fact, all these infinitesimal waves together would have an infinite capacity to absorb energy, and they'd suck all the heat out of anything you put into the room… instantly freezing the tea in your cup, or the sun, or even a supernova.

Luckily, the universe doesn't work that way… because, as Planck guessed, the tiny, high frequency waves can only carry away energy in huge packets. They're like fussy kids who'll only accept exactly thirty-seven cookies, or a hundred and sixty-two thousand cookies, no more and no less. Because they're so picky, the fussy high frequency waves lose out, and most of the energy is carried away in lower-frequency packets that are willing to take an equal share.

This common, average energy that the packets carry is in fact what we mean by "temperature." So a higher temperature just means higher average energy, and thus, by Planck's rule, a higher frequency of light emitted. That's why as an object gets hotter it glows first infrared, then red, yellow, white; hotter and hotter towards blue, violet, ultraviolet… and so on.
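The packet rule Planck arrived at is usually written E = h × f: the energy of one packet is the light's frequency times a fixed constant of nature. As a rough illustration of how packet size grows with frequency, here is a minimal Python sketch; Planck's constant and the sample frequencies are standard textbook values, not figures from the video.

```python
# Planck's packet rule: each quantum of light of frequency f carries energy E = h * f.
PLANCK_H = 6.626e-34  # Planck's constant, in joule-seconds

def packet_energy(frequency_hz: float) -> float:
    """Energy (in joules) carried by a single packet of light at the given frequency."""
    return PLANCK_H * frequency_hz

# Illustrative frequencies: infrared, red light, and ultraviolet.
for name, f in [("infrared, 3e13 Hz", 3e13),
                ("red light, 4.3e14 Hz", 4.3e14),
                ("ultraviolet, 1e15 Hz", 1e15)]:
    print(f"{name}: {packet_energy(f):.2e} J per packet")
```

A higher frequency means a bigger minimum packet, which is exactly why the tiny, high-frequency waves end up being so "fussy" about the energy they will accept.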
Specifically, Planck's quantum theory of fussy light tells us that light bulb filaments should be heated to a temperature of about 3200 Kelvin to ensure that most of the energy is emitted as visible waves; much hotter, and we'd start tanning from the ultraviolet light. Actually, quantum physics has been staring us in the face since long before lightbulbs and tanning beds: human beings have been making fires for millennia, with the color of the flames spelling out "quantum" all along.
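One standard consequence of Planck's spectrum is Wien's displacement law, which says the wavelength where emission peaks is inversely proportional to temperature. The video doesn't mention the law by name, but a short sketch of it (with illustrative temperatures chosen here, not taken from the video) shows the hotter-means-bluer trend behind the filament figure.

```python
# Wien's displacement law: a hot body's emission peaks at wavelength b / T.
WIEN_B = 2.898e-3  # Wien's displacement constant, in metre-kelvins

def peak_wavelength_nm(temperature_k: float) -> float:
    """Wavelength (in nanometres) at which a body at temperature_k emits most strongly."""
    return WIEN_B / temperature_k * 1e9

for temp_k in (1500, 3200, 5800, 10000):
    print(f"{temp_k} K -> peak emission near {peak_wavelength_nm(temp_k):.0f} nm")
```

A 3200 K filament peaks just past the red end of the visible band, the Sun's roughly 5800 K surface peaks in the middle of it, and pushing the temperature much higher shifts the peak toward the ultraviolet.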