The Science of Information: From Language to Black Holes


  • 2015
  • 1 Season

The Science of Information: From Language to Black Holes is an engaging and informative course offered by The Great Courses Signature Collection. The course is presented by Professor Benjamin Schumacher, an esteemed physicist and information theorist, who provides a fascinating exploration of the nature of information and its fundamental role in our universe.

The course is organized into 24 lectures covering a wide range of topics in the science of information. Professor Schumacher begins by introducing the concept of information and how it has evolved from analogue to digital formats. He then delves into the mathematical principles of information theory and explores how they are used to measure and transmit information. Later lectures range from the physics of black holes to the biology of the brain, all united by the role that information plays in each.

One of the main themes of the course is the idea that information is a fundamental aspect of the universe. Professor Schumacher provides many examples of how information is embedded in the fabric of our reality, including the communication between neurons in the brain and the encoding of genetic information within DNA. The course also explores how information theory has influenced fields such as cryptography, computer science, and even anthropology.

The course also examines language and communication, including the origins of language and its structure. Professor Schumacher describes how the principles of information theory have been used to create algorithms that translate between different languages, and how these algorithms have grown more sophisticated over time. The lectures on language provide a fascinating look at how information is used in day-to-day communication.

Another topic that the course covers in great detail is the physics of black holes. Professor Schumacher provides an accessible explanation of the principles of general relativity and how they relate to the formation and properties of black holes. He also explores the status of information that falls into a black hole, a place of seemingly unlimited density.

Throughout the course, Professor Schumacher provides many examples and analogies to help the viewer understand the complex concepts being presented. For example, he compares the transmission of digital information to a language and explains how the principles of information theory can be used to create error-correcting codes that reduce the chance of data loss or corruption.
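To make the error-correction idea concrete, here is a minimal sketch in Python (an illustration, not material from the course) of the simplest error-correcting code: the triple-repetition code. Each bit is sent three times, and a majority vote at the receiving end survives one flipped copy per group.

```python
def encode(bits):
    """Triple-repetition code: transmit each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three recovers the original bit
    even if one copy in the group was flipped by noise."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)           # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[1] ^= 1                     # noise flips one bit...
sent[6] ^= 1                     # ...and another, in a different group
assert decode(sent) == message   # the message still decodes correctly
```

The price of this protection is a threefold slowdown; the course's point is that cleverer codes achieve the same reliability far more cheaply.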

Overall, The Science of Information: From Language to Black Holes is an engaging and enlightening course that provides a deep dive into the fundamental role that information plays in the universe. Professor Schumacher’s presentation is clear and accessible, making even the most complex concepts understandable to viewers without a scientific background. Whether you are a scientist or just someone curious about the nature of the universe, this course is sure to provide many insights and memorable moments.

The Science of Information: From Language to Black Holes has one season of 24 episodes. The series first aired on December 11, 2015.


Seasons
24. The Meaning of Information
December 11, 2015
Survey the phenomenon of information from pre-history to the projected far future, focusing on the special problem of anti-cryptography: designing an understandable message for future humans or alien civilizations. Close by revisiting Shannon's original definition of information and asking, "What does the theory of information leave out?"
23. It from Bit: Physics from Information
December 11, 2015
Physicist John A. Wheeler's phrase "It from bit" makes a profound point about the connection between reality and information. Follow this idea into a black hole to investigate the status of information in a place of unlimited density. Also explore the information content of the entire universe!
22. Quantum Cryptography via Entanglement
December 11, 2015
Learn how a feature of the quantum world called entanglement is the key to an unbreakable code. Review the counterintuitive rules of entanglement. Then play a game based on The Newlywed Game that illustrates the monogamy of entanglement. This is the principle underlying quantum cryptography.
21. Qubits and Quantum Information
December 11, 2015
Enter the quantum realm to see how this revolutionary branch of physics is transforming the science of information. Begin with the double-slit experiment, which pinpoints the bizarre behavior that makes quantum information so different. Work your way toward a concept that seems positively magical: the quantum computer.
20. Uncomputable Functions and Incompleteness
December 11, 2015
Algorithmic information is plagued by a strange impossibility that shakes the very foundations of logic and mathematics. Investigate this drama in four acts, starting with a famous conundrum called the Berry Paradox and including Turing's surprising proof that no single computer program can determine whether other programs will ever halt.
19. Turing Machines and Algorithmic Information
December 11, 2015
Contrast Shannon's code- and communication-based approach to information with a new, algorithmic way of thinking about the problem in terms of descriptions and computations. See how this idea relates to Alan Turing's theoretical universal computing machine, which underlies the operation of all digital computers.
18. Horse Races and Stock Markets
December 11, 2015
One of Claude Shannon's colleagues at Bell Labs was the brilliant scientist and brash Texan John Kelly. Explore Kelly's insight that information is the advantage we have in betting on possible alternatives. Apply his celebrated log-optimal strategy to horse racing and stock trading.
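Kelly's rule has a compact closed form. The Python sketch below (illustrative, not taken from the course) computes the log-optimal fraction of a bankroll to wager on a bet that pays b-to-1 and wins with probability p.

```python
def kelly_fraction(p, b):
    """Kelly's log-optimal bet: wager the fraction f* = (b*p - (1 - p)) / b
    of the bankroll on a bet paying b-to-1 with win probability p.
    A non-positive f* means there is no edge, so bet nothing."""
    return max(0.0, (b * p - (1 - p)) / b)

# A 60% chance of winning at even odds (b = 1) gives a 20% stake.
print(round(kelly_fraction(0.6, 1), 3))   # 0.2
# A fair coin at even odds offers no edge, so the optimal bet is zero.
print(kelly_fraction(0.5, 1))             # 0.0
```

Kelly's connection to information theory is that the long-run growth rate of the bankroll under this strategy is governed by the same logarithmic quantities Shannon used to measure information.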
17. Erasure Cost and Reversible Computing
December 11, 2015
Maxwell's demon has startling implications for the push toward ever-faster computers. Probe the connection between the second law of thermodynamics and the erasure of information, which turns out to be a practical barrier to computer processing speed. Learn how computer scientists deal with the demon.
16. Entropy and Microstate Information
December 11, 2015
Return to the concept of entropy, tracing its origin to thermodynamics, the branch of science dealing with heat. Discover that here the laws of nature and information meet. Understand the influential second law of thermodynamics, and conduct a famous thought experiment called Maxwell's demon.
15. Neural Codes in the Brain
December 11, 2015
Study the workings of our innermost information system: the brain. Take both top-down and bottom-up approaches, focusing on the world of perception, experience, and external behavior on the one hand versus the intricacies of neuron activity on the other. Then estimate the total information capacity of the brain.
14. Life's Origins and DNA Computing
December 11, 2015
DNA, RNA, and the protein molecules they assemble are so interdependent that it's hard to picture how life got started in the first place. Survey a selection of intriguing theories, including the view that genetic information in living cells results from eons of natural computation.
13. What Genetic Information Can Do
December 11, 2015
Learn how DNA and RNA serve as the digital medium for genetic information. Also see how shared features of different life forms allow us to trace our origins back to an organism known as LUCA, the last universal common ancestor, which lived 3.5 to 4 billion years ago.
12. Unbreakable Codes and Public Keys
December 11, 2015
The one-time pad may be unbreakable in principle, but consider the common mistakes that make this code system vulnerable. Focus on the Venona project, which deciphered Soviet intelligence messages encrypted with one-time pads. Close with the mathematics behind public key cryptography, which makes modern transactions secure, for now.
11. Cryptanalysis and Unraveling the Enigma
December 11, 2015
Unravel the analysis that broke the super-secure Enigma code system used by the Germans during World War II. Led by British mathematician Alan Turing, the code breakers had to repeat their feat every day throughout the war. Also examine Claude Shannon's revolutionary views on the nature of secrecy.
10. Cryptography and Key Entropy
December 11, 2015
The science of information is also the science of secrets. Investigate the history of cryptography starting with the simple cipher used by Julius Caesar. See how entropy is a useful measure of the security of an encryption key, and follow the deciphering strategies that cracked early codes.
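Caesar's cipher is simple enough to sketch in a few lines. The Python below (an illustration, not course material) shifts each letter by a fixed key; with only 25 usable shifts, the key carries under five bits of entropy, which is why trying every key is trivial.

```python
def caesar(text, shift):
    """Caesar cipher: shift each letter by a fixed amount, wrapping
    past 'Z'; non-letters pass through unchanged."""
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
        else:
            out.append(ch)
    return ''.join(out)

secret = caesar("ATTACK AT DAWN", 3)
print(secret)                # DWWDFN DW GDZQ
print(caesar(secret, -3))    # ATTACK AT DAWN
```

Decryption is just the negative shift, and because Python's `%` handles negative operands, the same function serves both directions.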
9. Signals and Bandwidth
December 11, 2015
Twelve billion miles from Earth, the Voyager spacecraft is sending back data with just a 20-watt transmitter. Make sense of this amazing feat by delving into the details of the Nyquist-Shannon sampling theorem, signal-to-noise ratio, and bandwidth: concepts that apply to many types of communication.
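The trade-off among bandwidth, noise, and data rate that this lecture covers is captured by the Shannon-Hartley theorem. The Python sketch below uses purely illustrative numbers (not Voyager's actual link budget) to show that even a channel where noise swamps the signal retains nonzero capacity.

```python
from math import log2

def channel_capacity(bandwidth_hz, snr):
    """Shannon-Hartley theorem: C = B * log2(1 + S/N) bits per second."""
    return bandwidth_hz * log2(1 + snr)

# Illustrative only: a 1 kHz channel whose noise power is ten times
# the signal power still carries over a hundred bits per second.
print(round(channel_capacity(1000, 0.1), 1))   # 137.5
# At equal signal and noise power, capacity is one bit per hertz.
print(channel_capacity(1000, 1))               # 1000.0
```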
8. Error-Correcting Codes
December 11, 2015
Dig into different techniques for error correction. Start with a game called word golf, which demonstrates the perils of mistaking one letter for another and how to guard against it. Then graduate to approaches used for correcting errors in computer operating systems, CDs, and data transmissions from the Voyager spacecraft.
7. Noise and Channel Capacity
December 11, 2015
One of the key issues in information theory is noise: the message received may not convey everything about the message sent. Discover Shannon's second fundamental theorem, which proves that error correction is possible and can be built into a message with only a modest slowdown in transmission rate.
6. Encoding Images and Sounds
December 11, 2015
Learn how some data can be compressed beyond the minimum amount of information required by the entropy of the source. Typically used for images, music, and video, these techniques drastically reduce the size of a file without significant loss of quality. See how this works in the MP3, JPEG, and MPEG formats.
5. Data Compression and Prefix-Free Codes
December 11, 2015
Probe the link between entropy and coding. In the process, encounter Shannon's first fundamental theorem, which specifies how far information can be squeezed in a binary code, serving as the basis for data compression. See how this works with a text such as Conan Doyle's The Return of Sherlock Holmes.
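The defining property of a prefix-free code is that no codeword is a prefix of another, so a bit stream decodes unambiguously without separators. The Python sketch below (illustrative codewords of my own choosing, not from the course) shows this in action.

```python
# A prefix-free code: no codeword is a prefix of any other, so the
# decoder can emit a symbol the moment a complete codeword arrives.
code = {'e': '0', 't': '10', 'a': '110', 'o': '111'}
decode_table = {v: k for k, v in code.items()}

def decode(bits):
    out, buf = [], ''
    for b in bits:
        buf += b
        if buf in decode_table:          # a complete codeword has arrived
            out.append(decode_table[buf])
            buf = ''
    return ''.join(out)

encoded = ''.join(code[c] for c in 'tea')
print(encoded)           # 100110
print(decode(encoded))   # tea
```

Note that frequent symbols get short codewords, which is exactly the mechanism data compression exploits.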
4. Entropy and the Average Surprise
December 11, 2015
Intuition says we measure information by looking at the length of a message. But Shannon's information theory starts with something more fundamental: how surprising is the message? Through illuminating examples, discover that entropy provides a measure of the average surprise.
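The "average surprise" has a precise form. In the illustrative Python sketch below (not course material), the surprise of an outcome with probability p is log2(1/p) bits, and entropy is the probability-weighted average of that surprise.

```python
from math import log2

def entropy(probs):
    """Shannon entropy: the average surprise, where an outcome with
    probability p carries log2(1/p) bits of surprise."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))              # 1.0 (a fair coin flip is worth one bit)
# A heavily biased coin is less surprising on average, so under one bit:
print(round(entropy([0.9, 0.1]), 3))    # 0.469
```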
3. Measuring Information
December 11, 2015
How is information measured and how is it encoded most efficiently? Get acquainted with a subtle but powerful quantity that is vital to the science of information: entropy. Measuring information in terms of entropy sheds light on everything from password security to efficient binary codes to how to design a good guessing game.
2. Computation and Logic Gates
December 11, 2015
Accompany the young Claude Shannon to the Massachusetts Institute of Technology, where in 1937 he submitted a master's thesis proving that Boolean algebra could be used to simplify the unwieldy analog computing devices of the day. Drawing on Shannon's ideas, learn how to design a simple electronic circuit that performs basic mathematical calculations.
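Shannon's insight that Boolean algebra describes circuits can be sketched in a few lines. The Python below (an illustration, not from the lecture) builds a half adder, the circuit that adds two one-bit numbers, out of just two gates.

```python
def AND(a, b):
    return a & b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    """Add two 1-bit numbers: XOR produces the sum bit, AND the carry."""
    return XOR(a, b), AND(a, b)

# Print the full truth table of the circuit.
for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} -> carry {carry}, sum {s}")
```

Chaining such adders together, carry into carry, yields circuits that add numbers of any width, which is the sense in which Boolean algebra underlies arithmetic hardware.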
1. The Transformability of Information
December 11, 2015
What is information? Explore the surprising answer of American mathematician Claude Shannon, who concluded that information is the ability to distinguish reliably among possible alternatives. Consider why this idea was so revolutionary, and see how it led to the concept of the bit, the basic unit of information.
Where to Watch The Science of Information: From Language to Black Holes
The Science of Information: From Language to Black Holes is available for streaming on The Great Courses Signature Collection website, both as individual episodes and as full seasons. You can also watch The Science of Information: From Language to Black Holes on demand at Amazon Prime and Amazon.
  • Premiere Date
    December 11, 2015