
The expectation-maximization (EM) iterative algorithm is widely used for parameter estimation in the presence of missing information. Such a situation arises naturally when the data of interest are observed only on a bounded observation window. This thesis focuses on the application of the EM algorithm to truncated Gaussian mixtures and compares the proposed algorithm with the approach of a previously published article, Lee and Scott [2012], which relies on a heuristic simplification that is not sufficiently justified mathematically. We compare the behaviour of the proposed algorithm with the procedure from that article in a series of simulated experiments as well as in the analysis of a real dataset. We also provide a Python implementation of the EM algorithm for truncated Gaussian mixtures.
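As a rough illustration of the kind of procedure discussed above, the following is a minimal sketch of one EM iteration for a univariate Gaussian mixture observed only on a window [a, b]. It is not the thesis's implementation nor a faithful reproduction of Lee and Scott's procedure: the function name em_step, the use of scipy.stats.norm, and the moment-matching M-step are assumptions made purely for illustration.

```python
# Illustrative sketch only: one EM iteration for a univariate Gaussian mixture
# whose observations are truncated to the window [a, b]. The M-step below is a
# moment-matching heuristic, not the exact maximizer of the expected complete
# log-likelihood and not the procedure of the thesis or of Lee and Scott [2012].
import numpy as np
from scipy.stats import norm

def em_step(x, a, b, pi, mu, sigma):
    """One EM iteration; x is a 1-D sample in [a, b], pi/mu/sigma are length-K arrays."""
    alpha = (a - mu) / sigma
    beta = (b - mu) / sigma
    Z = norm.cdf(beta) - norm.cdf(alpha)            # mass of each component inside [a, b]

    # E-step: responsibilities under the window-renormalised component densities.
    dens = norm.pdf(x[:, None], mu, sigma) / Z      # (n, K) truncated densities
    resp = pi * dens
    resp /= resp.sum(axis=1, keepdims=True)
    nk = resp.sum(axis=0)

    # Truncated-normal moment corrections evaluated at the current parameters.
    m1 = (norm.pdf(alpha) - norm.pdf(beta)) / Z                               # mean shift / sigma
    m2 = 1 + (alpha * norm.pdf(alpha) - beta * norm.pdf(beta)) / Z - m1 ** 2  # variance ratio

    # M-step (moment matching): undo the bias that truncation induces in the
    # responsibility-weighted sample moments.
    xbar = resp.T @ x / nk
    s2 = (resp * (x[:, None] - xbar) ** 2).sum(axis=0) / nk
    new_mu = xbar - sigma * m1
    new_sigma = np.sqrt(s2 / m2)
    new_pi = nk / len(x)
    return new_pi, new_mu, new_sigma
```

The design choice illustrated here is the one the abstract alludes to: the responsibilities are computed with component densities renormalised to the observation window, and the mean and variance updates apply truncated-normal moment corrections because the raw weighted sample moments are biased under truncation.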
mixture distribution; multivariate normal distribution; EM algorithm; truncated observations; incomplete observations
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator. | 0 |
| Popularity | Current impact/attention of the article in the research community, based on the underlying citation network. | Average |
| Influence | Overall/total impact of the article in the research community, based on the underlying citation network (diachronic). | Average |
| Impulse | Initial momentum of the article directly after its publication, based on the underlying citation network. | Average |
