Low-Complexity Methods For Estimation After Parameter Selection
Nadav Harel, Tirza Routtenberg
Length: 00:14:55
Statistical inference of multiple parameters often involves a preliminary parameter selection stage. The selection stage has an impact on subsequent estimation, for example by introducing a selection bias. The post-selection maximum likelihood (PSML) estimator is shown to reduce the selection bias and the post-selection mean-squared-error (PSMSE) compared with conventional estimators, such as the maximum likelihood (ML) estimator. However, the computational complexity of the PSML is usually high due to the multi-dimensional exhaustive search for a global maximum of the post-selection log-likelihood (PSLL) function. Moreover, the PSLL involves the probability of selection, which, in general, does not have an analytical form. In this paper, we develop new low-complexity post-selection estimation methods for a two-stage estimation-after-parameter-selection architecture. The methods implement the iterative maximization by parts (MBP) approach, which decomposes the PSLL function into "easily-optimized" and complicated parts. For low-complexity performance analysis, we develop the empirical post-selection Cramér-Rao-type lower bound. Simulations demonstrate that the proposed post-selection estimation methods are tractable and reduce both the bias and the PSMSE, compared with the ML estimator, while only requiring moderate computational complexity.
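As a rough illustration of the MBP idea described above (a sketch of the general principle, not the paper's exact algorithm), the following Python snippet applies the decomposition to a hypothetical two-population Gaussian toy problem: the population with the larger sample mean is selected, and its mean is then estimated. The post-selection log-likelihood splits into an "easily-optimized" part (the ordinary Gaussian log-likelihood) and a complicated part (minus the log probability of selection); each iteration freezes the gradient of the complicated part at the previous iterate, so the easy part is maximized in closed form. In this toy case the selection probability has an analytical form; in general it does not, which is the setting the paper's low-complexity methods address. The function name and the plug-in of the unselected sample mean are illustrative assumptions.

import numpy as np
from scipy.stats import norm

def psml_mbp_gaussian(xbar, sigma2, n, n_iter=50, tol=1e-10):
    # Illustrative MBP sketch for a two-population Gaussian toy model:
    # select the population with the larger sample mean, then iteratively
    # correct the (biased) ML estimate of the selected mean.
    m = int(np.argmax(xbar))           # index of the selected population
    other = 1 - m
    s = np.sqrt(2.0 * sigma2 / n)      # std of the difference of sample means
    theta = xbar[m]                    # start from the biased ML estimate
    for _ in range(n_iter):
        # Plug the unselected sample mean in for the unselected parameter
        # (a simplifying assumption for this sketch).
        z = (theta - xbar[other]) / s
        # Gradient of the complicated part, -log P(selection), at the
        # previous iterate; here P(selection) = Phi(z) in closed form.
        grad_hard = -norm.pdf(z) / (norm.cdf(z) * s)
        # Closed-form maximizer of the easy (Gaussian) part with the
        # complicated part's gradient held fixed.
        theta_new = xbar[m] + (sigma2 / n) * grad_hard
        if abs(theta_new - theta) < tol:
            theta = theta_new
            break
        theta = theta_new
    return m, theta

A quick usage example: draw n = 5 samples from each of two populations with equal true means, select the larger sample mean, and compare the raw sample mean with the MBP-corrected estimate, which is pulled back toward the true value and so exhibits less selection bias.

rng = np.random.default_rng(0)
theta_true, sigma2, n = np.array([1.0, 1.0]), 1.0, 5
x = rng.normal(theta_true, np.sqrt(sigma2), size=(n, 2))
xbar = x.mean(axis=0)
m, theta_hat = psml_mbp_gaussian(xbar, sigma2, n)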
Chairs: Soosan Beheshti