### Abstract

Nearest neighbor cells in ℝ^d, d ∈ ℕ, are used to define coefficients of divergence (φ-divergences) between continuous multivariate samples. For large sample sizes, such distances are shown to be asymptotically normal with a variance depending on the underlying point density. In d = 1, this extends classical central limit theory for sum functions of spacings. The general results yield central limit theorems for logarithmic k-spacings, information gain, log-likelihood ratios and the number of pairs of sample points within a fixed distance of each other.
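As a rough illustration of the d = 1 case mentioned in the abstract (a sketch, not the paper's construction), the statistic below sums the logarithms of the spacings of a uniform sample; classical theory says such sum functions of spacings are asymptotically normal. The function name `log_spacings_statistic` and the normalization are illustrative choices, not taken from the article.

```python
import numpy as np

def log_spacings_statistic(x):
    """Sum of log-spacings for a 1-d sample on [0, 1].

    For X_1, ..., X_n i.i.d. uniform on [0, 1], the spacings are the
    gaps between consecutive order statistics (including the boundary
    gaps at 0 and 1). Each rescaled gap m * D_i, with m = n + 1 gaps,
    is approximately Exp(1), and the sum of log(m * D_i) is
    asymptotically normal as n grows.
    """
    xs = np.sort(np.asarray(x))
    # include boundary gaps so the spacings partition [0, 1]
    gaps = np.diff(np.concatenate(([0.0], xs, [1.0])))
    m = len(gaps)
    return np.sum(np.log(m * gaps))

rng = np.random.default_rng(0)
n = 2000
# replicate the statistic to see its approximately Gaussian spread
stats = [log_spacings_statistic(rng.uniform(size=n)) for _ in range(200)]
```

The per-gap mean of `stats` hovers near −γ ≈ −0.5772 (the Euler–Mascheroni constant, since E[log Exp(1)] = −γ), consistent with the classical limit theory for log-spacings.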

Original language | English |
---|---|

Pages (from-to) | 158-185 |

Number of pages | 28 |

Journal | Annals of Applied Probability |

Volume | 19 |

Issue number | 1 |

ISSN | 1050-5164 |

DOIs | https://doi.org/10.1214/08-AAP537 |

Publication status | Published - Feb 1 2009 |


### All Science Journal Classification (ASJC) codes

- Statistics and Probability
- Statistics, Probability and Uncertainty

### Cite this

Baryshnikov, Yu., Penrose, M. D., & Yukich, J. E. (2009). **Gaussian limits for generalized spacings.** *Annals of Applied Probability*, *19*(1), 158-185. https://doi.org/10.1214/08-AAP537

Research output: Contribution to journal › Article
