Description
|
Claude Elwood Shannon was born on April 30, 1916, in Petoskey, Michigan. After attending primary and secondary school in his nearby hometown of Gaylord, he earned bachelor's degrees in both electrical engineering and mathematics from the University of Michigan. After graduation, Shannon moved to the Massachusetts Institute of Technology (MIT) to pursue his graduate studies. While at MIT, he worked with Dr. Vannevar Bush on one of the early calculating machines, the "differential analyzer," which used a precisely honed system of shafts, gears, wheels, and disks to solve equations in calculus. Though analog computers like this turned out to be little more than footnotes in the history of the computer, Dr. Shannon quickly made his mark with digital electronics, a considerably more influential idea. In a prize-winning master's thesis completed in the Department of Mathematics, Shannon proposed a method for applying a mathematical form of logic called Boolean algebra to the design of relay switching circuits. This innovation, credited as the advance that transformed circuit design "from an art to a science," remains the basis for circuit and chip design to this day. Shannon received both a master's degree in electrical engineering and his Ph.D. in mathematics from MIT in 1940.
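The core insight of Shannon's thesis can be sketched in a few lines (a minimal illustration, not his original relay notation): two switches wired in series conduct only when both are closed, which behaves like Boolean AND, while two switches in parallel conduct when either is closed, which behaves like OR. Boolean identities then translate directly into circuit equivalences.

```python
def series(a: bool, b: bool) -> bool:
    """Two switches in series: current flows only if both are closed (AND)."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel: current flows if either is closed (OR)."""
    return a or b

# A Boolean identity becomes a statement about interchangeable circuits,
# e.g. the distributive law: a AND (b OR c) == (a AND b) OR (a AND c).
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            assert series(a, parallel(b, c)) == parallel(series(a, b), series(a, c))
```

Because any such identity holds for the circuits as well, a designer can simplify a switching network algebraically before building it, which is the sense in which the thesis turned circuit design "from an art to a science."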
In 1941, Shannon took a position at Bell Labs, where he had spent several prior summers. His wartime work on secret communication systems was used to build the system over which Roosevelt and Churchill communicated during the war. When his results were finally declassified and published in 1949, they revolutionized the field of cryptography. Understanding, before almost anyone, the power that springs from encoding information in a simple language of 1s and 0s, Dr. Shannon as a young scientist at Bell Laboratories wrote two papers that remain monuments in the fields of computer science and information theory. "Shannon was the person who saw that the binary digit was the fundamental element in all of communication," said Dr. Robert G. Gallager, a professor of electrical engineering who worked with Dr. Shannon at the Massachusetts Institute of Technology. "That was really his discovery, and from it the whole communications revolution has sprung."
Shannon’s most important paper, “A Mathematical Theory of Communication,” was published in 1948. This fundamental treatise both defined a mathematical notion by which information could be quantified and demonstrated that information could be delivered reliably over imperfect communication channels like phone lines or wireless connections. These groundbreaking innovations provided the tools that ushered in the information age. As noted by Ioan James, Shannon biographer for the Royal Society, “So wide were its repercussions that the theory was described as one of humanity’s proudest and rarest creations, a general scientific theory that could profoundly and rapidly alter humanity’s view of the world.” Shannon went on to develop many other important ideas whose impact expanded well beyond the field of “information theory” spawned by his 1948 paper.
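The quantity Shannon defined for measuring information, now called entropy, can be sketched briefly (a minimal illustration of the 1948 definition, measured in bits): the entropy of a probability distribution is the average number of binary digits needed per symbol from a source with that distribution.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly one bit of information:
print(entropy([0.5, 0.5]))   # 1.0
# A certain outcome carries none:
print(entropy([1.0]))        # 0.0 (or -0.0)
```

Intuitively, unlikely outcomes are more informative than likely ones, and a uniform distribution over n outcomes maximizes the entropy at log2(n) bits, which is why the binary digit is the natural unit of the theory.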
Shannon approached research with a sense of curiosity, humor, and fun. An accomplished unicyclist, he was famous for cycling the halls of Bell Labs at night, juggling as he went. His later work on chess-playing machines and an electronic mouse that could run a maze helped create the field of artificial intelligence, the effort to make machines that think. And his ability to combine abstract thinking with a practical approach — he had a penchant for building machines — inspired a generation of computer scientists. Dr. Marvin Minsky of M.I.T., who as a young theorist worked closely with Dr. Shannon, was struck by his enthusiasm and enterprise. "Whatever came up, he engaged it with joy, and he attacked it with some surprising resource — which might be some new kind of technical concept or a hammer and saw with some scraps of wood," Dr. Minsky said. "For him, the harder a problem might seem, the better the chance to find something new."
While Shannon worked in a field for which no Nobel prize is offered, his work was richly rewarded by honors including the National Medal of Science (1966) and honorary degrees from Yale (1954), Michigan (1961), Princeton (1962), Edinburgh (1964), Pittsburgh (1964), Northwestern (1970), Oxford (1978), East Anglia (1982), Carnegie-Mellon (1984), Tufts (1987), and the University of Pennsylvania (1991). He was also the first recipient of the Harvey Prize (1972), the Kyoto Prize (1985), and the Shannon Award (1973). The last of these awards, named in his honor, is given by the Information Theory Society of the Institute of Electrical and Electronics Engineers (IEEE) and remains the highest possible honor in the community of researchers dedicated to the field that he invented. His Collected Papers, published in 1993, contains 127 publications on topics ranging from communications to computing, and juggling to “mind-reading” machines.
|