An Introduction to Density-Based Minimum Distance Methods
A short course in Mathematics for Data Science at the Department of Mathematics
Bio
Ayanendranath Basu is a Professor at the Interdisciplinary Statistical Research Unit, Indian Statistical Institute, Kolkata. He received his Ph.D. from the Pennsylvania State University, USA, in 1991, working under the supervision of Professor Bruce G. Lindsay. Subsequently he spent four years at the Department of Mathematics, University of Texas, Austin, as an Assistant Professor. In 1995 he joined the Indian Statistical Institute, where he currently holds the post of Professor. His research interests are in the areas of minimum distance inference, robust inference, multivariate analysis, categorical data and biostatistics. He has written over one hundred peer-reviewed articles in reputed international journals, has co-authored two books and has jointly edited four books. He has supervised four Ph.D. theses and is currently supervising three more. He is a fellow of the Indian National Academy of Sciences and the West Bengal Academy of Science and Technology. He was also the recipient of the C.R. Rao National Award in Statistics, 2007. He is a former editor of Sankhya, the Indian Journal of Statistics, Series B, and serves on the editorial boards of several other international journals.
Abstract of the course
The approach to statistical inference based on the minimization of a suitable statistical distance (or divergence) is a useful and popular technique. It has natural applications in the areas of parametric estimation and parametric tests of hypotheses, together with many non-parametric uses. It is also widely used in other fields such as information theory, machine learning and pattern recognition. The scope of application of minimum distance techniques has increased greatly over the last three or four decades with the emergence of the area of robustness, where these techniques are seen to be extremely useful. At the same time, complicated divergence-based applications have become easy to implement with the exponential growth in computing power.
In the literature, two broad types of distances/divergences are available:
- Distances between the distribution function of the data (such as the empirical distribution function or its refinements) and the model distribution function. Examples include the Kolmogorov-Smirnov distance, the Cramér-von Mises distance, the Anderson-Darling distance, etc.
- Distances between the probability density (or mass) function of the data (such as a non-parametric density estimator or the vector of relative frequencies) and the model density. Examples include the likelihood disparity, the Pearson chi-square, the Hellinger distance, the Kullback-Leibler divergence, the family of φ-divergence measures, the Bregman divergences, the Burbea-Rao divergence, etc. (Illustrative formulas for a representative member of each class are given after this list.)
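To fix ideas (a hedged illustration; scaling conventions vary across authors), a representative member of each class may be written as

$$ D_{\mathrm{KS}}(F_n, F_\theta) = \sup_x \big| F_n(x) - F_\theta(x) \big|, \qquad \mathrm{HD}(g_n, f_\theta) = \int \Big( \sqrt{g_n(x)} - \sqrt{f_\theta(x)} \Big)^2 \, dx, $$

where $F_n$ is the empirical distribution function, $g_n$ is a non-parametric density estimate, and $F_\theta$, $f_\theta$ are the model distribution function and density. The first (Kolmogorov-Smirnov) compares distribution functions; the second (a squared Hellinger distance) compares densities.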
In this set of lectures our emphasis will be on density-based divergences, more particularly on measures of the φ-divergence type. The φ-divergences and the Bregman divergences are the two most general classes of density-based divergences, with the likelihood disparity being the divergence common to both families. We will primarily focus our attention on φ-divergence type measures, or disparities, but will briefly describe some common Bregman divergences during the later part of our discussion.
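In one standard notation (a sketch; the exact conventions used in the lectures may differ slightly), the two general classes take the forms

$$ D_\varphi(g, f_\theta) = \int f_\theta(x)\, \varphi\!\left( \frac{g(x)}{f_\theta(x)} \right) dx, \qquad B_\psi(g, f_\theta) = \int \Big[ \psi\big(g(x)\big) - \psi\big(f_\theta(x)\big) - \big(g(x) - f_\theta(x)\big)\, \psi'\big(f_\theta(x)\big) \Big]\, dx, $$

where $\varphi$ is convex with $\varphi(1) = 0$ and $\psi$ is convex and differentiable. Choosing $\varphi(t) = t \log t$ in the first form, or $\psi(t) = t \log t$ in the second, both lead (for densities integrating to one) to $\int g \log (g/f_\theta)\, dx$, the Kullback-Leibler divergence underlying the likelihood disparity, which is why this divergence is the member common to both families.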
The minimum distance methods assumed greater importance when the idea of robustness was developed, and our entire description of the development of these methods will be from the point of view of robustness. In this connection we will show that the minimum distance estimators based on φ-divergences have estimating equations but are not M-estimators, whereas the minimum divergence estimators based on Bregman divergences are legitimate M-estimators.
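To indicate roughly where this distinction comes from (a sketch in the disparity notation of Lindsay, under conventions that may differ from those adopted in the lectures), write the Pearson residual as $\delta(x) = g_n(x)/f_\theta(x) - 1$. A minimum φ-divergence (minimum disparity) estimator solves an estimating equation of the form

$$ \int A\big(\delta(x)\big)\, \nabla_\theta f_\theta(x)\, dx = 0, $$

where $A(\cdot)$ is the residual adjustment function determined by the divergence. Since $\delta$ involves the non-parametric density estimate $g_n$ at every point, this cannot in general be written as a sum over the observations of a fixed function of $(X_i, \theta)$, so the estimator falls outside the M-estimation framework. By contrast, for a Bregman-type divergence such as the density power divergence with tuning parameter $\alpha > 0$, the estimating equation takes the form

$$ \frac{1}{n} \sum_{i=1}^{n} u_\theta(X_i)\, f_\theta^{\alpha}(X_i) - \int u_\theta(x)\, f_\theta^{1+\alpha}(x)\, dx = 0, \qquad u_\theta = \nabla_\theta \log f_\theta, $$

which is a sum over the data of a fixed function of $(X_i, \theta)$ and hence an M-estimating equation.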
The lectures will focus on the development of φ-divergences, their influence functions and asymptotic efficiency, the small-sample deficiency of these estimators, and weighted likelihood estimators that follow from the φ-divergence idea, and will conclude with a brief description of Bregman-type divergences and related inference. A brief indication of the uses of these divergences in multinomial goodness-of-fit testing will also be provided.
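To give a flavour of these ideas in practice, the following is a minimal, self-contained sketch (not part of the course material; the Poisson model, contamination level and seed are arbitrary choices for illustration) of a minimum Hellinger distance estimate of a Poisson mean, compared with the maximum likelihood estimate on contaminated count data.

```python
# A minimal illustrative sketch (hypothetical example, not course material):
# minimum Hellinger distance estimation of a Poisson mean on contaminated
# count data, compared with the maximum likelihood estimate.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

rng = np.random.default_rng(0)

# Simulated data: mostly Poisson(2), plus a few gross outliers at 15.
data = np.concatenate([rng.poisson(2.0, size=95), np.full(5, 15)])

# Non-parametric density estimate for counts: the vector of relative frequencies.
support = np.arange(data.max() + 1)
rel_freq = np.bincount(data, minlength=support.size) / data.size

def squared_hellinger(lam):
    # Squared Hellinger distance between the relative frequencies and the
    # Poisson(lam) probabilities, restricted to the observed support.
    model = poisson.pmf(support, lam)
    return np.sum((np.sqrt(rel_freq) - np.sqrt(model)) ** 2)

mhde = minimize_scalar(squared_hellinger, bounds=(0.1, 20.0), method="bounded").x
mle = data.mean()  # the MLE of the Poisson mean

print(f"MLE of the Poisson mean:             {mle:.3f}")
print(f"Minimum Hellinger distance estimate: {mhde:.3f}")
```

On such contaminated data the minimum Hellinger distance estimate typically stays close to the Poisson mean of the bulk of the observations, while the sample mean (the MLE) is pulled towards the outliers, which is the robustness phenomenon the lectures develop.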
Schedule
- Wednesday 2018/07/11 09.00-13.00
Details
- Venue: Polo Scientifico e Tecnologico F. Ferrari – Room A108
- Language: English
- Admission: the course is open to everyone and free of charge.
- How to apply: send an email to Claudio Agostinelli