Information, Entropy and Their Geometric Structures
Information theory was founded in the late 1940s and 1950s on the work of Claude Shannon and Jacques Laplume in communication and of Léon Brillouin in statistical physics, among other main contributors. These foundations were conventionally built on linear algebra and probability models in classical spaces (vector spaces, normed spaces, ...).
At the turn of the century, new and fruitful interactions were found between several branches of science: Information Science (information theory, digital communications, statistical signal processing, ...), Mathematics (group theory, geometry and topology, probability, statistics, ...) and Physical Sciences (geometric mechanics, thermodynamics, statistical physics, quantum mechanics, ...).
Probability theory was conceived by Blaise Pascal and Jacob Bernoulli. Pierre de Fermat also contributed, through his correspondence with Pascal, to the foundations of the theory, which arose almost by accident from the study of the Chevalier de Méré's game (Antoine Gombaud, Chevalier de Méré, a French nobleman with an interest in gaming and gambling questions, called Pascal's attention to an apparent contradiction concerning a popular dice game). The theory was then consolidated by many contributors, such as Pierre Simon Laplace, Abraham de Moivre and Carl Friedrich Gauss in the eighteenth century, and by Émile Borel, Andreï Kolmogorov and Paul Lévy in the twentieth. Probability is again undergoing a refoundation, aimed at capturing new structures and generalizing the theory to more abstract spaces (metric spaces, homogeneous manifolds, graphs, ...). A first attempt at generalizing probability to metric spaces was made by Maurice Fréchet in the middle of the twentieth century, in the framework of topologically affine abstract spaces and "distance spaces" ("espaces distanciés") satisfying the triangle inequality.