
Abstract

This study introduces constrained forms of the Shannon information entropy and Kullback–Leibler cross-entropy functions. Applicable to a closed system, these functions incorporate the constraints on the system generically, irrespective of the form or even the number of the constraints. Because they are fully constrained, the constrained functions may be "pulled apart" into their partial or local constrained forms, providing the means to examine the probability of each outcome or state relative to its local maximum-entropy (or minimum-cross-entropy) position, independently of the other states of the system. The Shannon entropy and Kullback–Leibler cross-entropy functions do not possess this functionality. The features of each function are examined in detail, and the similarities between the constrained and Tsallis entropy functions are explored. The continuous constrained entropy and cross-entropy functions are also introduced; their integrands may be used to examine each infinitesimal state (physical element) of the system independently.
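For background on the quantities the abstract builds on, the following is a minimal sketch (not the paper's constrained-entropy construction itself) of the classical Shannon entropy, the Kullback–Leibler cross-entropy, and a Jaynes-style maximum-entropy distribution under a single mean constraint, where the Lagrange multiplier of the exponential (Boltzmann) form is found by bisection. All function names and the die example are illustrative assumptions, not taken from the paper.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i ln p_i (natural logarithm)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler cross-entropy D(p||q) = sum_i p_i ln(p_i/q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def maxent_distribution(x, mean_target, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution over outcomes x subject to the single
    constraint <x> = mean_target.  The solution has the Boltzmann form
    p_i proportional to exp(-lam * x_i); lam is found by bisection,
    using the fact that the constrained mean decreases as lam grows."""
    def mean_for(lam):
        w = [math.exp(-lam * xi) for xi in x]
        z = sum(w)
        return sum(wi * xi for wi, xi in zip(w, x)) / z

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > mean_target:
            lo = mid  # mean still too high: lam must increase
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * xi) for xi in x]
    z = sum(w)
    return [wi / z for wi in w]

# Illustrative example: a six-sided die constrained to mean 4.5.
x = [1, 2, 3, 4, 5, 6]
p = maxent_distribution(x, 4.5)
uniform = [1 / 6] * 6
print(shannon_entropy(p))         # below ln 6, the unconstrained maximum
print(kl_divergence(p, uniform))  # relative entropy against the uniform prior
```

For a uniform prior, D(p||u) reduces to ln 6 - H(p), so the two printed values sum to ln 6; the constrained functions introduced in the paper generalize this interplay between entropy and constraints.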
