Gaussian mixture noise can model non-Gaussian noise and is also useful when outliers are present. For deterministic maximum likelihood direction finding in Gaussian mixture noise, Kozick and Sadler applied and designed the Space-Alternating Generalized Expectation-Maximization (SAGE) algorithm, an extension of the expectation-maximization algorithm, some twenty years ago; it updates all direction of arrival (DOA) estimates simultaneously at each iteration and cannot converge properly when the signal powers are unequal. In this article, the Alternating Expectation-Conditional Maximization (AECM) algorithm, an extension of the SAGE algorithm, is applied and designed; it exploits multiple less informative versions of the complete data together with the golden section search method to update the DOA estimates sequentially (one by one) at each iteration. Theoretical analysis shows that the per-iteration computational complexity of the AECM algorithm is almost the same as that of the SAGE algorithm, while numerical results show that the AECM algorithm converges to a stable solution faster and is therefore computationally more efficient.
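As background for the one-dimensional DOA update mentioned above, a minimal sketch of the standard golden section search is given below. This is a generic textbook routine for minimizing a unimodal function on an interval, not the paper's own implementation; the function name, tolerance, and the quadratic test objective are illustrative assumptions.

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by golden section search.

    Generic textbook routine (illustrative; not the paper's code). In an
    AECM-style DOA update, f would be the negative conditional likelihood
    as a function of a single DOA, searched over an angular interval.
    """
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi ~= 0.618
    # Two interior probe points dividing [a, b] in the golden ratio.
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while (b - a) > tol:
        # Keep the sub-interval that must contain the minimizer.
        if f(c) < f(d):
            b = d
        else:
            a = c
        c = b - inv_phi * (b - a)
        d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

# Illustrative usage: minimize (x - 2)^2 on [0, 5]; the result is near 2.
x_star = golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracket by a constant factor of about 0.618 and needs only one new function evaluation, which is why the method suits per-DOA updates where the likelihood is expensive to evaluate.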