
The historical search for the fundamental meaning of thermodynamic entropy led to the discovery of the connection between entropy and information about microscopic dynamics, which in turn motivated the development of the theory of information in communications. In this paper I will review how information theory can profitably be applied back to its roots in dynamics in order to characterize the essential properties of a measured time series.

Entropy was introduced in the familiar modern form (dS = δQ/T) by Clausius in 1854, building on work by Carnot (1824) and Kelvin (~1850) to understand the nature of heat and irreversibility in thermodynamic systems.^{1,2} Boltzmann was dedicated to understanding the microscopic meaning of this macroscopic entropy; influenced by Maxwell's kinetic theory of gases (~1860) he introduced the relationship H = ∫ f(r,p,t) log f(r,p,t) d³r d³p (where f is the velocity distribution in a gas) in 1872, and then the more general form S = k log W (where W is the number of available states) around 1877. In 1871 Maxwell created his demon,^3 which appeared to violate the second law of thermodynamics by intelligent action. Szilard, in 1929, made the significant step of considering a one-molecule gas that could be on either side of a partition; this introduced the idea of a binary bit of information (which side of the partition the molecule is on) and of using entropy to measure information.^4 Although Szilard missed the crucial role of erasure in explaining the demon's (mis)behavior (this was first recognized by Landauer^5), he had laid the foundation for the development of both reversible computation^6 and of information theory.^7 In 1948 Shannon applied entropy to measure the information content in an arbitrary message, independent of its physical origin, and was thereby able to solve significant outstanding communications problems such as the maximum rate at which a message can be sent through a channel.
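Shannon's measure can be illustrated with a short sketch (the helper name `shannon_entropy` is mine, not from the paper): the entropy H = -Σ pᵢ log₂ pᵢ of a message's symbol distribution bounds the average number of bits per symbol needed to encode it.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Entropy H = -sum(p * log2(p)) of the symbol distribution, in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniform 4-symbol alphabet needs 2 bits/symbol; a constant message carries none.
print(shannon_entropy("abcd"))  # 2.0
print(shannon_entropy("aaaa"))  # 0 bits
```

A message drawn from a biased distribution falls between these extremes, which is exactly the compressibility that Shannon's channel theorems quantify.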
Information theory has since flourished in engineering practice.^8 The study of ergodic systems has both benefited from, and contributed to, information theory.^9 More recently, Shaw pointed out the connection between information and dissipative dynamics,^10 and Fraser extended this to develop a framework that I will describe here for characterizing the structure in time series produced by nonlinear systems.^11 Although a dynamical system's global structure can perform nontrivial computations,^{12,13} analyzing the information evolution associated with the much simpler local behavior is sufficient to answer deep questions about the complexity and predictability of a system. In this paper I will explain how to understand and measure such information. As well as being quite useful in practice, this application of information theory back to its roots in dynamics provides a simple but clear example of the physical meaning of information.^14

Assume that a physical system is described by a state vector x⃗ and governing equations dx⃗/dt = f(x⃗) (or x⃗_{t+1} = f(x⃗_t)). This need not imply that the system is finite-dimensional; the underlying governing equations may be infinite-dimensional partial differential equations which reduce to a finite-dimensional mode expansion due to dissipation. Let y(x⃗(t)) be a scalar experimentally accessible quantity that is a function of the state of the system (such as the temperature or velocity at a point in a fluid convection cell, or the concentration of a particular species in a chemical reaction). The goal is to learn as much as possible about the underlying system given only the time series y(t). The necessary connection between the observed time series and its physical origin is provided by state-space reconstruction.^{15-18} Construct a new vector out of lagged copies of the observable, y⃗_t = (y_t, y_{t-T}, ..., y_{t-(d-1)T}), where the time lag T and the dimension d are parameters that will be discussed shortly. If d is large enough, then for almost any choice of the governing equations f, the observable y(x⃗), and the time delay T, the motion of the
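The reconstruction step can be sketched in a few lines (a minimal illustration assuming NumPy; the function name `delay_embed` is mine, not the paper's): each row of the output is one delay vector y⃗_t = (y_t, y_{t-T}, ..., y_{t-(d-1)T}).

```python
import numpy as np

def delay_embed(y, d, T):
    """Build delay vectors (y_t, y_{t-T}, ..., y_{t-(d-1)T}) from a scalar series.

    Returns an array of shape (N - (d-1)*T, d); row i is the reconstructed
    state at time t = i + (d-1)*T, with the most recent sample first.
    """
    y = np.asarray(y)
    n = len(y) - (d - 1) * T
    if n <= 0:
        raise ValueError("series too short for this choice of d and T")
    # Column k holds the samples lagged by k*T.
    return np.column_stack(
        [y[(d - 1 - k) * T : (d - 1 - k) * T + n] for k in range(d)]
    )

# Example: reconstruct a 3-dimensional state from a sine-wave observable.
t = np.arange(1000)
series = np.sin(0.05 * t)
vectors = delay_embed(series, d=3, T=10)
print(vectors.shape)  # (980, 3)
```

The choice of d and T here is arbitrary; as the text notes, picking them well is the subject of the discussion that follows.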
