ROBUST PARAMETER ESTIMATION BASED ON THE K-DIVERGENCE
Yair Sorek, Koby Todros
In this paper we present a new divergence, called the $\mathcal{K}$-divergence, that involves a weighted version of the hypothesized log-likelihood function. To down-weight low-density regions attributed to outliers, the corresponding weight function is the underlying density convolved with a strictly positive smoothing $\mathcal{K}$ernel function parameterized by a bandwidth parameter. The resulting minimum $\mathcal{K}$-divergence estimator (M$\mathcal{K}$DE) operates by minimizing the empirical $\mathcal{K}$-divergence w.r.t. the vector parameter of interest. Owing to the structure of the weight function, the M$\mathcal{K}$DE utilizes Parzen's non-parametric kernel density estimator to suppress outliers. We show that, by proper selection of the kernel's bandwidth parameter, the M$\mathcal{K}$DE can gain enhanced estimation performance along with implementation simplicity as compared to other robust estimators.
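To make the estimator's structure concrete, the following is a minimal sketch for a scalar Gaussian location-scale model under outlier contamination. It assumes, as a hypothetical surrogate for the empirical $\mathcal{K}$-divergence, that the M$\mathcal{K}$DE objective reduces (up to terms independent of the parameter) to a negative log-likelihood weighted by Parzen's kernel density estimate at each sample; the helper names `parzen_weights` and `mkde_objective`, the Gaussian kernel choice, and the bandwidth value `h = 0.5` are illustrative assumptions, not the paper's prescriptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def parzen_weights(x, h):
    # Parzen KDE with a Gaussian kernel, evaluated at the sample points:
    # w_n = (1/(N*h)) * sum_m phi((x_n - x_m)/h).
    # Samples in low-density regions (outliers) receive small weights.
    diffs = (x[:, None] - x[None, :]) / h
    return norm.pdf(diffs).mean(axis=1) / h

def mkde_objective(theta, x, w):
    # Parzen-weighted negative log-likelihood: a hedged stand-in for the
    # empirical K-divergence, keeping only the theta-dependent term.
    mu, log_sigma = theta
    return -np.sum(w * norm.logpdf(x, loc=mu, scale=np.exp(log_sigma)))

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 950),   # nominal samples
                    rng.normal(8.0, 1.0, 50)])   # outliers
h = 0.5                                          # assumed bandwidth
w = parzen_weights(x, h)
res = minimize(mkde_objective, x0=[np.median(x), 0.0], args=(x, w))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)  # location estimate stays near 0 despite outliers
```

In this sketch the bandwidth plays the role described in the abstract: a small `h` makes the weights track the empirical density closely and strongly suppress isolated outliers, while a large `h` flattens the weights toward ordinary maximum likelihood.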