Who was Claude Shannon, and why is he famous? Shannon's information theory in brief


Claude Elwood Shannon was an award-winning American mathematician, electronics engineer, and cryptographer, known as the father of information theory.


It was Shannon who proposed using the now-familiar "bit" as the name for the smallest unit of information.

Shannon became famous as the father of information theory with the landmark paper he published in 1948. He is also credited with laying the groundwork for the digital computer and digital technology in general as far back as 1937, when the 21-year-old Shannon, a master's student at the Massachusetts Institute of Technology, wrote a thesis demonstrating that electrical circuits built on Boolean algebra could construct and solve any logical or numerical relationship. An article based on his thesis earned him a prize from the American Institute of Electrical Engineers in 1940.

During World War II, Shannon made significant contributions to the field of cryptanalysis while working on national defense, including his seminal project on breaking codes and ensuring secure telecommunications.

Shannon was born on April 30, 1916, in Petoskey, Michigan, and grew up in nearby Gaylord, Michigan. His father, a descendant of early New Jersey settlers, was a self-made man, a businessman and a judge. Claude's mother taught English and for some time served as principal of the Gaylord high school. Shannon spent most of his first 16 years in Gaylord and graduated from the local school in 1932. From childhood he was interested in building mechanical and electrical models. His favorite subjects were science and mathematics, and in his free time at home he built model airplanes, a radio-controlled model boat, and even a wireless telegraph connecting him to the house of a friend who lived half a mile from the Shannons.

As a teenager, Claude worked part-time as a courier for Western Union. His childhood hero was Thomas Edison who, as it later turned out, was also a distant relative: both were descendants of John Ogden, a 17th-century colonial leader and ancestor of many prominent people. Politics, by contrast, held no interest for Shannon; he was, moreover, an atheist.

In 1932, Claude became a student at the University of Michigan, where one of his courses introduced him to the intricacies of Boolean algebra. After graduating in 1936 with two bachelor's degrees, in mathematics and electrical engineering, he continued his studies at MIT, where he worked on one of the first analog computers, Vannevar Bush's differential analyzer. It was then that he realized the concepts of Boolean algebra could be put to far more practical use. His master's thesis, "A Symbolic Analysis of Relay and Switching Circuits," is considered by experts to be one of the most important master's theses of the 20th century.

In the spring of 1940, Shannon received his doctorate in mathematics from MIT with a dissertation entitled "An Algebra for Theoretical Genetics," and then, from 1941 to 1956, he worked at Bell Labs, where his interests turned to fire-control systems and cryptography, the work that occupied him during World War II.

At Bell Labs, Shannon met his future wife, Betty, who worked there in numerical analysis; they married in 1949. In 1956, Shannon returned to MIT, where he was offered a chair, and worked there for 22 years.

His hobbies included juggling, unicycling, and chess. He invented all kinds of amusing gadgets, including rocket-powered flying discs, a motorized pogo stick, and a flame-throwing trumpet for a science exhibition. He is also credited, together with Edward O. Thorp, with inventing the first wearable computer; the two used the device to improve their odds at roulette, and their forays into Las Vegas proved very successful.

Shannon spent his final years in a nursing home, suffering from Alzheimer's disease. He passed away on February 24, 2001.

Anatoly Ushakov, Doctor of Technical Sciences, Professor, Department of Control Systems and Informatics, ITMO University

Many generations of technical specialists in the second half of the 20th century, even those far removed from automatic control theory and cybernetics, left their universities remembering for the rest of their lives the names attached to landmark scientific and engineering achievements: Lyapunov functions, Markov processes, the Nyquist frequency criterion, the Wiener process, the Kalman filter. Among such achievements, Shannon's theorems hold pride of place. 2016 marks the hundredth anniversary of the birth of their author, the scientist and engineer Claude Shannon.

“Who owns the information, owns the world”

W. Churchill

Fig. 1. Claude Shannon (1916–2001)

Claude Elwood Shannon (Fig. 1) was born on April 30, 1916, in Petoskey, a town on the shore of Lake Michigan in the state of Michigan (USA), to a lawyer father and a foreign-language teacher mother. His older sister Katherine was fond of mathematics and eventually became a professor, while his father combined his legal work with amateur radio. A distant relative of the future engineer was the world-famous inventor Thomas Edison, holder of 1,093 patents.

Shannon graduated from the local high school in 1932, at the age of sixteen, while also receiving additional schooling at home: his father bought him construction sets and amateur radio kits and encouraged his son's technical creativity in every way, and his sister drew him into advanced mathematics. Shannon fell in love with both worlds, engineering and mathematics.

In 1932, Shannon entered the University of Michigan, from which he graduated in 1936 with a bachelor's degree in a double major, mathematics and electrical engineering. In the university library he found two works by George Boole, "The Mathematical Analysis of Logic" and "The Calculus of Logic," written in 1847 and 1848 respectively. Shannon studied them closely, and this, apparently, set his future scientific interests.

After graduation, Claude Shannon took a job as a research assistant in the electrical engineering laboratory of the Massachusetts Institute of Technology (MIT), where he worked on upgrading the differential analyzer, an analog "computer" built by Vannevar Bush, then MIT's vice president. From that time on, Bush was Claude Shannon's scientific mentor. While studying the analyzer's intricate, highly specialized control circuitry of relays and switches, Shannon realized that George Boole's concepts could be put to excellent use in exactly this area.

At the end of 1936, Shannon entered the master's program, and by 1937 he had written his thesis and, on its basis, prepared the article "A Symbolic Analysis of Relay and Switching Circuits," published in 1938 in the transactions of the American Institute of Electrical Engineers (AIEE). The work attracted the attention of the electrical engineering community, and in 1939 the American Society of Civil Engineers awarded Shannon the Alfred Noble Prize for it.

Before even defending his master's thesis, Shannon, on Bush's advice, decided to pursue a doctorate in mathematics at MIT on a problem in genetics; in Bush's view, genetics could be a fruitful area for applying Shannon's expertise. Shannon's doctoral dissertation, "An Algebra for Theoretical Genetics," was completed in the spring of 1940 and was devoted to the combinatorics of genes. Shannon received his doctorate in mathematics and at the same time defended his master's thesis, "A Symbolic Analysis of Relay and Switching Circuits," becoming a master of electrical engineering.

Shannon's doctoral dissertation found little support among geneticists and for that reason was never published. The master's thesis, however, proved a breakthrough in switching and digital technology. Its final chapter gave many examples of the successful application of Shannon's logical calculus to the analysis and synthesis of specific relay and switching circuits: selector circuits, an electrically coded lock, binary adders. All of them clearly demonstrated the scientific breakthrough Shannon had achieved and the enormous practical value of the formalism of logical calculus. This was the birth of digital logic.
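To make the flavor of those adder examples concrete, here is a minimal sketch in Python rather than in the relay notation of the thesis; the function names and structure are illustrative, not Shannon's own.

```python
# A one-bit binary adder assembled purely from Boolean operations,
# in the spirit of the thesis examples (modern names, not Shannon's).

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """One-bit half adder: XOR gives the sum bit, AND gives the carry."""
    total = a != b          # XOR
    carry = a and b         # AND
    return total, carry

def full_adder(a: bool, b: bool, carry_in: bool) -> tuple[bool, bool]:
    """One-bit full adder chained from two half adders."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 or c2     # OR merges the two carry paths

# 1 + 1 with carry-in 0: sum bit 0, carry 1 (binary 10)
print(full_adder(True, True, False))  # (False, True)
```

Chaining such one-bit stages carry-to-carry yields a multi-bit adder, which is exactly the kind of relay network the thesis analyzed.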

Fig. 2. Claude Shannon at Bell Labs (mid-1940s)

In the spring of 1941, Claude Shannon joined the mathematics department of the Bell Laboratories research center (Fig. 2). A few words should be said about the atmosphere in which the 25-year-old Shannon found himself; it was created by Harry Nyquist, Hendrik Bode, Ralph Hartley, John Tukey, and other Bell Laboratories staff. All of them already had results of their own in the nascent theory of information, which Shannon would eventually develop to the level of a major science.

By this time war was already raging in Europe, and Shannon was conducting research generously funded by the US government. His work at Bell Laboratories was connected with cryptography, which led him to the mathematical theory of cryptography and eventually enabled him to analyze ciphertexts by information-theoretic methods (Fig. 3).

In 1945, Shannon completed a large classified scientific report, "Communication Theory of Secrecy Systems."

Fig. 3. At the encryption machine

By this time Claude Shannon was ready to present the scientific community with new fundamental concepts in information theory, and in 1948 he published his landmark work, "A Mathematical Theory of Communication." Shannon's mathematical theory of communication assumed a three-part structure: a source of information, a receiver of information, and a "transport medium," the communication channel, characterized by its capacity and by its tendency to distort information in transit. This raised a definite circle of problems: how to quantify information; how to pack it efficiently; how to estimate the admissible rate at which a source may feed information into a channel of fixed capacity so that transmission is guaranteed error-free; and, finally, how to solve that last problem when there is noise in the channel. Claude Shannon gave humanity comprehensive answers to all these questions with his theorems.

It should be said that his fellow professionals helped Shannon with terminology: the term for the minimum unit of information, the "bit," was proposed by John Tukey, and the term for the average amount of information per source symbol, "entropy," by John von Neumann. Shannon presented his seminal work in the form of twenty-three theorems. Not all of them are of equal weight; some are auxiliary or address special cases of information theory and its transmission over discrete and continuous channels, but six theorems are conceptual, and together they frame the edifice of the information theory Shannon created.

  1. The first of these six theorems concerns the quantitative measure of the information produced by a source, within a stochastic framework, using entropy as that measure.
  2. The second theorem addresses the rational packing of the symbols produced by a source during primary encoding. It gave rise to the efficient coding procedure and to the "source encoder" as a component of a transmission system.
  3. The third theorem concerns matching the flow of information from a source to the capacity of a noiseless communication channel, which guarantees no distortion of information in transit.
  4. The fourth theorem solves the same problem in the presence of noise in a binary communication channel, when the noise can corrupt any given code bit with some probability. It contains a condition on slowing the transmission rate that guarantees delivery of the code message to the recipient with a specified probability of being error-free. This theorem is the methodological basis of error-correcting coding, and it led to the "channel encoder" as a component of a transmission system.
  5. The fifth theorem estimates the capacity of a continuous communication channel characterized by a frequency bandwidth and by given powers of the useful signal and of the noise in the channel. The theorem defines what is now called the Shannon limit. A numeric sketch of theorems 1 and 5 follows this list.
  6. The last of the six, now called the Nyquist-Shannon-Kotelnikov sampling theorem, concerns the error-free reconstruction of a continuous signal from its discrete-time samples; it dictates the required sampling interval, determined by the width of the signal's frequency spectrum, and gives the basis functions, known as sampling (sinc) functions.
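As a numeric illustration of the quantities in theorems 1 and 5, here is a short sketch in Python; the symbol probabilities, bandwidth, and signal-to-noise figures are invented for the example.

```python
# Entropy of a discrete source (theorem 1) and capacity of a band-limited
# noisy channel (theorem 5). All numbers below are invented illustrations.
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A four-symbol source: skewed probabilities carry less than 2 bits/symbol.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
print(entropy([0.25] * 4))                 # 2.0 (uniform = maximum)

def capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon limit C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# A 3 kHz channel at 30 dB SNR (S/N = 1000): roughly 30 kbit/s.
print(capacity(3000, 1000, 1))  # ~29901.7
```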

It should be said that at first many mathematicians around the world had doubts about the rigor of the proofs of these theorems. But over time the scientific community became convinced of the correctness of all the statements, finding rigorous mathematical confirmation for them. In our country, A. Ya. Khinchin and A. N. Kolmogorov devoted their efforts to this task.

In 1956, the now-famous Claude Shannon left Bell Laboratories, without severing ties with it, to become a full professor at the Massachusetts Institute of Technology in two departments: mathematics and electrical engineering.

Fig. 4. Shannon's maze

Claude Shannon always had many interests entirely unrelated to his professional work. His outstanding engineering talent showed itself in the creation of all kinds of machines and mechanisms, including the mechanical mouse Theseus, which solved a maze (Fig. 4), a computer that performed arithmetic on Roman numerals, and machines and programs for playing chess.

In 1966, at the age of 50, Claude Shannon retired from teaching and devoted himself almost entirely to his hobbies. He built a unicycle with two saddles, a folding knife with a hundred blades, robots that solved a Rubik's cube, and a robot that juggled balls. Shannon himself continued honing his juggling skills, bringing his record to four balls (Fig. 5). Witnesses of his younger days at Bell Laboratories recalled him riding the company's corridors on a unicycle while juggling balls.

Fig. 5. Claude Shannon the juggler

Unfortunately, Claude Shannon had no close contacts with Soviet scientists. He did, however, visit the USSR in 1965 at the invitation of the A. S. Popov Scientific and Technical Society of Radio Engineering, Electronics, and Communications (NTORES). One of the initiators of the invitation was Mikhail Botvinnik, multiple world chess champion, Doctor of Technical Sciences and professor, who as an electrical engineer was himself interested in chess programming. Botvinnik and Shannon had a lively discussion about the problems of computerizing the art of chess, concluding that the subject was very interesting for programming but unpromising for chess itself. After the discussion, Shannon asked Botvinnik for a game; during it he even held a slight material advantage (a rook for a knight and a pawn), but still lost on the 42nd move.

In his last years, Claude Shannon was seriously ill. He died of Alzheimer's disease in February 2001 in a Massachusetts nursing home, at the age of 84.

Claude Shannon left a rich applied and philosophical legacy. He created a general theory of the devices of digital automation and computing, and a technique for using the capacity of a channel medium efficiently. All modern archivers (data-compression programs) used in the computer world rely on Shannon's efficient coding theorem. His philosophical legacy rests on two ideas. First, the goal of any control should be to reduce entropy as a measure of uncertainty and disorder in the system's environment; control that does not solve this problem is redundant, that is, unnecessary. Second, everything in this world is in some sense a "communication channel": a person, a team, a whole functional environment, an industry, a transport network, the country as a whole. If technical, informational, humanitarian, and governmental decisions are not matched to the capacity of the channel medium they are designed for, good results should not be expected.
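As an illustration of the efficient coding the archivers rely on, the sketch below builds a variable-length prefix code with Huffman's 1952 algorithm, a practical descendant of Shannon's theorem rather than Shannon's own construction; the symbol frequencies are invented.

```python
# A toy variable-length source coder in the spirit of the efficient coding
# theorem: frequent symbols get shorter codewords. Frequencies are invented.
import heapq

def huffman_codes(freqs):
    """Build a binary prefix code from {symbol: frequency}."""
    # Heap entries: (weight, tiebreaker, {symbol: partial codeword}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)     # two lightest subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_codes({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(codes)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
# Average length 1.75 bits/symbol, equal to the source entropy: no waste.
```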


Literature

  1. Shannon C. E. A Mathematical Theory of Communication // Bell System Technical Journal. July and Oct. 1948. Reprinted in: Claude Elwood Shannon. Collected Papers. N.Y., 1993. P. 8–111.
  2. Shannon C. E. Communication in the Presence of Noise // Proc. IRE. 1949. V. 37. No. 10.
  3. Shannon C. E. Communication Theory of Secrecy Systems // Bell System Technical Journal. 1949. Reprinted in: Claude Elwood Shannon. Collected Papers. N.Y., 1993. P. 112–195.
  4. Automata Studies. Collection of articles, ed. C. E. Shannon and J. McCarthy / Trans. from English. M.: Foreign Literature Publishing House, 1956.
  5. Fano R. M. Transmission of Information: A Statistical Theory of Communication. The M.I.T. Press and John Wiley & Sons, Inc. New York, London, 1961.
  6. www.research.att.com/~njas/doc/ces5.html
  7. Kolmogorov A. N. Preface // Shannon C. Works on Information Theory and Cybernetics / Trans. from English, ed. R. L. Dobrushin and O. B. Lupanov; preface by A. N. Kolmogorov. M., 1963.
  8. Levin V. I. C. E. Shannon and Modern Science // Vestnik TSTU. 2008. Vol. 14. No. 3.
  9. Wiener N. I Am a Mathematician / Trans. from English. M.: Nauka, 1964.
  10. Khinchin A. Ya. On the Basic Theorems of Information Theory // UMN 11:1 (67). 1956.
  11. Kolmogorov A. N. The Theory of Information Transmission // Session of the USSR Academy of Sciences on Scientific Problems of Production Automation, October 15–20, 1956. Plenary session. M.: Publishing House of the USSR Academy of Sciences, 1957.
  12. Kolmogorov A. N. Information Theory and the Theory of Algorithms. M.: Nauka, 1987.

Claude Elwood Shannon was a famous American engineer and mathematician whose work unites mathematical ideas with the analysis of the very complex process of their technical implementation. Claude Shannon is famous above all for developing information theory, the foundation of modern high-tech communication systems. Shannon also contributed greatly to a number of fields gathered under the umbrella of "cybernetics": he created the theory of probabilistic circuits, the theory of automata, and the theory of control systems.

Claude Shannon - the making of an engineering genius

Claude Shannon was born in 1916 in Gaylord, Michigan, USA. From an early age he was interested both in technical constructions and in the generality of mathematical principles. He spent all his free time solving mathematical problems and tinkering with radio construction kits and crystal (detector) receivers.

It is not surprising that, as a student at the University of Michigan, Shannon took a double major in mathematics and electrical engineering. Thanks to this breadth of education and interests, Shannon's first great success came while he was a graduate student at the Massachusetts Institute of Technology: he proved that the operation of electrical circuits of relays and switches can be described by Boolean algebra. For this discovery Claude Shannon received the Alfred Noble Prize of the American engineering societies. He explained his stunning success quite modestly: "It's just that no one before me studied mathematics and electrical engineering at the same time."

Shannon and cryptography

In 1941, Shannon joined Bell Laboratories, where his main task was the development of complex cryptographic systems. This work later helped him create coding methods with error-correcting capability.

Claude Shannon was the first to approach cryptography from a scientific standpoint, publishing in 1949 the paper "Communication Theory of Secrecy Systems." The article consists of three sections: the first lays out the basic mathematical structures of secrecy systems, the second treats the problem of "theoretical secrecy," and the third the concept of "practical secrecy." Shannon's chief contribution to cryptography was his detailed study of the absolute secrecy of systems, in which he proved the existence of absolutely strong, unbreakable ciphers and established the necessary conditions for them.
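A minimal sketch of the cipher at the heart of that result, the one-time pad: under Shannon's conditions (the key is truly random, at least as long as the message, and never reused) the ciphertext reveals nothing about the plaintext. The message below is an invented example.

```python
# One-time pad: XOR the message with a fresh, truly random key of the same
# length. Reusing or shortening the key destroys the perfect-secrecy proof.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # one fresh random key per message

ciphertext = xor_bytes(message, key)      # encrypt
print(xor_bytes(ciphertext, key))         # decrypt: b'ATTACK AT DAWN'
```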

Claude Shannon was the first to formulate the theoretical foundations of cryptography and to articulate many of the concepts without which cryptography as a science would not exist.

Founder of computer science

At a certain point in his career, Claude Shannon set himself the task of improving the transmission of information over telephone and telegraph channels affected by electrical noise. He concluded that the best solution would be to "package" information more efficiently. But before beginning that research he had to answer the question of what information is and how to measure its quantity. In the 1948 article "A Mathematical Theory of Communication" he defined the amount of information in terms of entropy, a quantity known in thermodynamics as a measure of the disorder of a system, and named the smallest unit of information the "bit."
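In modern notation (a standard formula, stated here for concreteness rather than quoted from this article), the entropy of a source whose symbols occur with probabilities p_i is

```latex
H = -\sum_i p_i \log_2 p_i
```

For a fair coin, H = -(1/2)log2(1/2) - (1/2)log2(1/2) = 1 bit: exactly one choice between two equally probable alternatives.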

Later, building on his definition of the amount of information, Shannon proved a remarkable theorem on the capacity of noisy communication channels. In the years when it was developed the theorem found no practical application, but in the modern world of high-speed circuits it applies wherever information is stored, processed, or transmitted.

Almost a contemporary

Claude Shannon's contribution to science can hardly be overstated: without his discoveries, computer technology, the Internet, and the entire digital space would have been impossible. Beyond the theories that laid the foundation of information technology, the brilliant engineer and mathematician contributed to many other fields. He was among the first to show that machines are capable not only of intellectual work but also of learning. In 1950 he built a mechanical radio-controlled mouse that, thanks to a complex electronic circuit, could find its own way through a maze. He also devised a machine that could solve a Rubik's cube, and built an electronic machine for playing the board game Hex that regularly beat its opponents.

The brilliant scientist and inventor died at the age of 84 in 2001 from Alzheimer's disease in a Massachusetts nursing home.

Claude Shannon was born in 1916 and grew up in Gaylord, Michigan. Even as a child he showed an interest both in the fine detail of technical devices and in the generality of mathematical principles. He tinkered with the early crystal (detector) receivers his father brought him while solving the mathematical problems and puzzles supplied by his older sister Catherine, who later became a mathematics professor.

In 1936, University of Michigan graduate Claude Shannon, then 21 years old, managed to bridge the gap between the algebraic theory of logic and its practical application.
Shannon, who held two bachelor's degrees, in electrical engineering and in mathematics, served as operator of an unwieldy mechanical computing device called the differential analyzer, which his supervisor, Professor Vannevar Bush, had built in 1930. Bush suggested that Shannon study the logical organization of the machine as a dissertation topic. Gradually the outlines of a computer began to take shape in Shannon's mind: if electrical circuits were built according to the principles of Boolean algebra, they could express logical relationships, determine the truth of statements, and perform complex calculations.

Electrical circuits were obviously far more convenient than the gears and shafts, liberally coated with machine oil, of the differential analyzer. Shannon developed his ideas on the relationship between binary calculus, Boolean algebra, and electrical circuits in his master's thesis, published in 1938.
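The correspondence at the core of that thesis can be sketched in a few lines of Python (an illustration under modern names, not Shannon's notation): switches wired in series implement AND, switches wired in parallel implement OR.

```python
# Relay networks as Boolean expressions: series = AND, parallel = OR.

def series(*switches: bool) -> bool:
    """Current flows only if every switch in the chain is closed (AND)."""
    return all(switches)

def parallel(*switches: bool) -> bool:
    """Current flows if any branch is closed (OR)."""
    return any(switches)

# An example network as a Boolean expression: (a AND b) OR ((NOT a) AND c)
def circuit(a: bool, b: bool, c: bool) -> bool:
    return parallel(series(a, b), series(not a, c))

print(circuit(True, True, False))   # True: the a-b branch conducts
print(circuit(False, True, False))  # False: neither branch conducts
```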

In 1941, 25-year-old Claude Shannon went to work at Bell Laboratories, where, among other things, he became famous for riding a unicycle through the laboratory's corridors while juggling balls.

At the time, applying to technology the methods of the English scientist George Boole (1815–1864), who in 1847 had published a work with the telling title "The Mathematical Analysis of Logic, Being an Essay Towards a Calculus of Deductive Reasoning," was almost revolutionary. Shannon himself remarked modestly: "It just happened that no one else was familiar with both areas at the same time."

Another work of great value is "Communication Theory of Secrecy Systems" (1949), which laid out the mathematical foundations of cryptography.

During the war he worked on the development of cryptographic systems, and this later helped him discover error-correcting coding methods. In the same 1940s, incidentally, Shannon also busied himself with building a flying disc powered by a rocket engine. At the same time, Claude Elwood Shannon began developing the ideas that would later grow into the information theory that made him famous. Shannon's goal was to optimize the transmission of information over telephone and telegraph lines. To solve this problem he had to formulate what information is and how its quantity is determined. In his papers of 1948–49 he defined the amount of information through entropy, a quantity known in thermodynamics and statistical physics as a measure of the disorder of a system, and took as the unit of information what later came to be called the "bit": the choice of one of two equally probable alternatives.

From 1956 he was a member of the US National Academy of Sciences and of the American Academy of Arts and Sciences.

In his works, Claude Shannon defined the amount of information through entropy, a quantity known in thermodynamics and statistical physics as a measure of the disorder of a system, and took as the unit of information what was later dubbed the "bit": the choice of one of two equally probable alternatives. On the solid foundation of this definition he proved an astonishing theorem on the capacity of noisy communication channels; it was published in full in his works of 1957–1961 and now bears his name. What is the essence of Shannon's theorem? Every noisy communication channel is characterized by a maximum rate of information transfer, called the Shannon limit. At transmission rates above this limit, errors in the transmitted information are inevitable. From below, however, the limit can be approached as closely as desired: suitable coding of the information can provide an arbitrarily small probability of error over any noisy channel. In addition, Shannon tirelessly pursued all sorts of projects, from an electronic mouse capable of finding its way out of a maze to juggling machines and a theory of juggling, which, however, did not help him beat his personal record of juggling four balls. A toy simulation of the rate-for-reliability trade behind the theorem follows.
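To hint at what "suitable coding" buys, here is a toy Python simulation with invented parameters: on a binary symmetric channel, a threefold repetition code cuts the delivered error rate roughly from p to 3p²(1−p) + p³, at the cost of a third of the rate. Real capacity-approaching codes are far more elaborate than this.

```python
# Trading rate for reliability on a binary symmetric channel (toy example;
# the flip probability and bit counts are invented for illustration).
import random

random.seed(1)
FLIP_P = 0.1          # the channel flips each transmitted bit with p = 0.1
N_BITS = 100_000

def send(bit: int) -> int:
    """Transmit one bit through the noisy channel."""
    return bit ^ (random.random() < FLIP_P)

def send_repeated(bit: int, n: int = 3) -> int:
    """Send the bit n times; the receiver decodes by majority vote."""
    received = [send(bit) for _ in range(n)]
    return int(sum(received) > n // 2)

raw_errors = sum(send(0) for _ in range(N_BITS))
coded_errors = sum(send_repeated(0) for _ in range(N_BITS))
print(raw_errors / N_BITS)    # ~0.10: the raw channel error rate
print(coded_errors / N_BITS)  # ~0.028 = 3p^2(1-p) + p^3, at 1/3 the rate
```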