Divergence-Based Robust Estimators for Finite Survey Samples with Random Weights

D. Kepplinger¹, A.N. Vidyashankar²

¹ Department of Statistics, George Mason University, Fairfax, VA, USA [dkepplin@gmu.edu]
² Department of Statistics, George Mason University, Fairfax, VA, USA [avidyash@gmu.edu]

Keywords: Divergence-based inference; Minimum Hellinger Distance; Superpopulation model; Survey sampling

We present a robust and efficient divergence-based estimator for parameters of a finite population. Specifically, we investigate the behavior of the Minimum Hellinger Distance estimator under possibly biased sampling from a finite population when the sampling weights are themselves random or not known a priori. Classical estimators in this finite population model are highly sensitive to outliers, both in the quantities of interest and in the sampling weights. Minimum distance estimators, on the other hand, are known to be both highly efficient and robust in the usual i.i.d. setting. We show that many of these properties carry over to biased survey sampling with random weights under the superpopulation model. We demonstrate the utility of the estimator on simulated data and on a real-world survey sample.
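To convey the idea behind minimum Hellinger distance (MHD) estimation with survey weights, the following is a minimal illustrative sketch, not the estimator developed in this work: it fits a normal location model by minimizing the Hellinger distance between a weighted kernel density estimate and the model density, using synthetic data, made-up random weights, and a grid-based approximation of the distance.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde
from scipy.optimize import minimize_scalar
from scipy.integrate import trapezoid

rng = np.random.default_rng(0)

# Synthetic "sample" from a N(5, 1) superpopulation plus one gross outlier.
x = np.concatenate([rng.normal(5.0, 1.0, 200), [50.0]])
# Hypothetical random sampling weights (illustrative only).
w = rng.uniform(0.5, 1.5, size=x.size)

# Weighted kernel density estimate of the sampled data.
kde = gaussian_kde(x, weights=w / w.sum())
grid = np.linspace(x.min() - 3.0, x.max() + 3.0, 2000)
f_hat = kde(grid)

def hellinger_sq(mu):
    # Squared Hellinger distance between the KDE and N(mu, 1),
    # via H^2(f, g) = 1 - integral of sqrt(f * g).
    g = norm.pdf(grid, loc=mu, scale=1.0)
    return 1.0 - trapezoid(np.sqrt(f_hat * g), grid)

res = minimize_scalar(hellinger_sq, bounds=(x.min(), x.max()), method="bounded")
mhd_loc = res.x                        # MHD location estimate, near 5
mean_loc = np.average(x, weights=w)    # weighted mean, pulled toward the outlier
```

The weighted mean is dragged upward by the single outlier, while the MHD estimate stays close to the bulk of the data, illustrating the robustness property the abstract describes; efficiency at the model and the treatment of random weights under the superpopulation model are the subject of the paper itself.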