The Cramér-Rao bound (CRB) gives a lower bound on the variance of unbiased estimators, but in linear hierarchical Bayesian models with non-Gaussian priors it is difficult to evaluate because the marginal likelihood is intractable. Existing methods, including variational Bayes and Markov chain Monte Carlo (MCMC)-based approaches, often suffer from high computational cost and slow convergence. We propose an efficient framework to approximate the Fisher information matrix (FIM) and the CRB by expressing the gradient of the log marginal likelihood as a posterior expectation. Expectation propagation (EP) is used to approximate the posterior by a Gaussian, enabling more accurate moment estimation than purely sampling-based methods. Numerical experiments on small-scale sparse models show that the EP-based CRB approximation achieves lower average normalized mean squared error (NMSE) and faster convergence than classical baselines in non-Gaussian settings.
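A minimal sketch of the identities the abstract alludes to, stated under standard regularity conditions; the notation (y for the observations, x for the latent coefficients, θ for the deterministic parameters) is assumed here rather than taken from the paper:

\[
\nabla_{\theta}\log p(\mathbf{y};\theta)
  = \mathbb{E}_{p(\mathbf{x}\mid \mathbf{y};\theta)}\big[\nabla_{\theta}\log p(\mathbf{y},\mathbf{x};\theta)\big],
\]
\[
\mathbf{F}(\theta)
  = \mathbb{E}_{p(\mathbf{y};\theta)}\big[\nabla_{\theta}\log p(\mathbf{y};\theta)\,
    \nabla_{\theta}\log p(\mathbf{y};\theta)^{\top}\big],
\qquad
\mathrm{CRB}(\theta)=\mathbf{F}(\theta)^{-1}.
\]

In this sketch, the inner posterior expectation is evaluated using the EP Gaussian approximation of the posterior, while the outer expectation is replaced by a Monte Carlo average over simulated observations; the inverse of the resulting FIM estimate yields the approximate CRB.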
Efficient CRB estimation for linear models via expectation propagation and Monte Carlo sampling
IEEE Signal Processing Letters, 25 December 2025
Type:
Journal
Date:
2025-12-25
Department:
Systèmes de Communication
Eurecom Ref:
8550
Copyright:
© 2025 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
See also:
PERMALINK: https://www.eurecom.fr/publication/8550