
Gaussian Mixture Models (GMMs) are powerful tools for probability density modeling and soft clustering, and are widely used in data mining, signal processing, and computer vision. In many applications, the parameters of a GMM must be estimated from data before the model can be used. This task is typically handled by the Expectation-Maximization algorithm for Gaussian Mixture Models (EM-GMM), which is computationally demanding. In this paper we present an FPGA-based solution for the EM-GMM algorithm. We propose a pipeline-friendly EM-GMM algorithm, a variant of the original EM-GMM algorithm that can be converted into a fully-pipelined hardware architecture. To further improve performance, we design a Gaussian probability density function evaluation unit that works with fixed-point arithmetic. In our experiments, the FPGA-based solution generates accurate results while achieving up to 517 times speedup over a CPU-based solution and 28 times speedup over a GPU-based solution.
