arXiv:1702.03557

Improvements in the Small Sample Efficiency of the Minimum S-Divergence Estimators under Discrete Models

12 February 2017
A. Ghosh
A. Basu
Abstract

This paper considers the problem of inliers and empty cells, and the resulting relative inefficiency of estimation, under pure samples from a discrete population when the sample size is small. Many minimum divergence estimators in the S-divergence family, although possessing very strong outlier stability properties, have very poor small sample efficiency in the presence of inliers, and some are not even defined in the presence of a single empty cell; this limits the practical applicability of these estimators in spite of their otherwise sound robustness properties and high asymptotic efficiency. Here we study a penalized version of the S-divergences such that the resulting minimum divergence estimators are free from these issues without any change in their robustness properties or asymptotic efficiencies. We give a general proof of the asymptotic properties of these minimum penalized S-divergence estimators; this is a significant addition to the literature, since the asymptotics of penalized divergences that are not finitely defined have so far been unavailable. The small sample advantages of the minimum penalized S-divergence estimators are examined through an extensive simulation study, and some empirical suggestions regarding the choice of the relevant tuning parameters are provided.
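For context (this definition comes from the broader S-divergence literature and is not spelled out on this page), the two-parameter S-divergence between the empirical relative frequencies $d$ and a model pmf $f_\theta$ over a discrete support is

$$S_{(\alpha,\lambda)}(d, f_\theta) = \frac{1}{A}\sum_x f_\theta(x)^{1+\alpha} \;-\; \frac{1+\alpha}{AB}\sum_x f_\theta(x)^{B}\, d(x)^{A} \;+\; \frac{1}{B}\sum_x d(x)^{1+\alpha},$$

with $A = 1+\lambda(1-\alpha)$ and $B = \alpha-\lambda(1-\alpha)$. Since the middle term involves $d(x)^A$, the divergence fails to be finitely defined at an empty cell ($d(x)=0$) whenever $A<0$, which is the issue the penalized version addresses.

The empty-cell penalty is easiest to illustrate with the Hellinger distance, a member of this family. The sketch below is not the paper's estimator: it fits a Poisson model by minimizing a penalized squared Hellinger distance in which each empty cell's model-probability term is down-weighted by a penalty h (h = 1 recovers the ordinary distance). The function name penalized_hd, the Poisson model, the support truncation, and the choice h = 0.5 are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

def penalized_hd(theta, sample, h=1.0, support_max=60):
    """Penalized squared Hellinger distance (up to the usual factor of 2)
    between the empirical relative frequencies of `sample` and a
    Poisson(theta) model. Empty cells contribute h * f_theta(x) instead
    of f_theta(x); h = 1 gives the unpenalized distance."""
    x = np.arange(support_max + 1)
    f = poisson.pmf(x, theta)
    d = np.bincount(sample, minlength=support_max + 1)[:support_max + 1] / len(sample)
    nonempty = d > 0
    # Expand sum (sqrt(d) - sqrt(f))^2 = sum d - 2*sqrt(d*f) + f and
    # down-weight the f-terms of the empty cells by the penalty h.
    loss = np.sum(d[nonempty] - 2.0 * np.sqrt(d[nonempty] * f[nonempty]) + f[nonempty])
    loss += h * np.sum(f[~nonempty])
    return loss

rng = np.random.default_rng(0)
sample = rng.poisson(3.0, size=25)   # small sample, so empty cells are likely
res = minimize_scalar(penalized_hd, bounds=(0.1, 20.0),
                      args=(sample, 0.5), method="bounded")
print("minimum penalized-HD estimate of theta:", res.x)

With h = 1 the minimizer is the ordinary minimum Hellinger distance estimate; smaller h shrinks the influence of empty cells, which is the mechanism the paper applies across the whole S-divergence family.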
