
pmid: 12662553
The high-order Boltzmann machine (HOBM) approximates probability distributions defined on a set of binary variables, through a learning algorithm that uses Monte Carlo methods. The approximation distribution is a normalized exponential of a consensus function formed by high-degree terms and the structure of the HOBM is given by the set of weighted connections. We prove the convexity of the Kullback-Leibler divergence between the distribution to learn and the approximation distribution of the HOBM. We prove the convergence of the learning algorithm to the strict global minimum of the divergence, which corresponds to the maximum likelihood estimate of the connection weights, establishing the uniqueness of the solution. These theoretical results do not hold in the conventional Boltzmann machine, where the consensus function has first and second-degree terms and hidden units are used. Copyright 1996 Elsevier Science Ltd.
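For concreteness, here is a minimal sketch of the objects the abstract refers to, in assumed notation (the symbols C_w, p_w, Z, q, and the connection index alpha are illustrative, not taken from the paper): the consensus function assigns one weighted term to each possibly high-degree connection, the approximation distribution is its normalized exponential, and learning descends the gradient of the Kullback-Leibler divergence, whose two expectations are estimated by Monte Carlo sampling.

```latex
% Hedged sketch; notation (C_w, p_w, Z, q, \alpha) is assumed, not the paper's own.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
\begin{align*}
% Consensus function: one weighted term per (possibly high-degree) connection \alpha.
C_w(x) &= \sum_{\alpha} w_\alpha \prod_{i \in \alpha} x_i,
  \qquad x \in \{0,1\}^n \\
% Approximation distribution: normalized exponential of the consensus function.
p_w(x) &= \frac{e^{C_w(x)}}{Z(w)},
  \qquad Z(w) = \sum_{x \in \{0,1\}^n} e^{C_w(x)} \\
% Kullback-Leibler divergence from the target q to p_w (convex in w for the HOBM).
D(q \,\|\, p_w) &= \sum_{x} q(x) \log \frac{q(x)}{p_w(x)} \\
% Gradient: correlations under p_w (free phase) minus correlations under q
% (clamped phase); both expectations can be estimated by Monte Carlo sampling.
\frac{\partial D}{\partial w_\alpha}
  &= \mathbb{E}_{p_w}\Bigl[\textstyle\prod_{i \in \alpha} x_i\Bigr]
   - \mathbb{E}_{q}\Bigl[\textstyle\prod_{i \in \alpha} x_i\Bigr]
\end{align*}
\end{document}
```

Because the divergence is convex in the weights when no hidden units are present, gradient descent with exact expectations reaches the unique global minimum, which is the maximum likelihood estimate; marginalizing over hidden units, as in the conventional Boltzmann machine, breaks this convexity, which is why the results do not carry over there.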
Learning and adaptive systems in artificial intelligence, Monte Carlo methods, high-order Boltzmann machine
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator below. | 1 |
| Popularity | The "current" impact/attention (the "hype") of the article in the research community at large, based on the underlying citation network. | Average |
| Influence | The overall/total impact of the article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | The initial momentum of the article directly after its publication, based on the underlying citation network. | Average |
