Optimal Error Bounds in Normal and Edgeworth Approximation of Symmetric Binomial and Related Laws
(2024)
This thesis explores local and global normal and Edgeworth approximations for symmetric
binomial distributions. Further, it examines the normal approximation of convolution powers
of continuous and discrete uniform distributions.
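To fix ideas, here is a sketch of the two approximation modes in the symmetric binomial case; the standardization x_k and the displayed forms are the standard ones, stated here for orientation rather than quoted from the thesis. For S_n distributed as Bin(n, 1/2), with mean n/2 and standard deviation \sqrt{n}/2,

\[
  \mathbb{P}(S_n = k) \approx \frac{2}{\sqrt{n}}\,\varphi(x_k) \quad\text{(local)},
  \qquad
  \mathbb{P}(S_n \le k) \approx \Phi(x_k) \quad\text{(global)},
  \qquad
  x_k := \frac{k - n/2}{\sqrt{n}/2},
\]

where \varphi and \Phi denote the standard normal density and distribution function; Edgeworth approximations refine the right-hand sides by higher-order correction terms.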
We obtain the optimal constant in the local central limit theorem for symmetric binomial
distributions and its analogs in higher-order Edgeworth approximation. Further, we offer a
novel proof of the known optimal constant in the global central limit theorem for symmetric
binomial distributions using Fourier inversion. We also consider the effect of a simple continuity
correction in the global central limit theorem for symmetric binomial distributions. Here, and in
higher-order Edgeworth approximation, we find optimal constants and asymptotically sharp
bounds on the approximation error. Furthermore, we prove asymptotically sharp bounds on the
error in the local case of a relative normal approximation to symmetric binomial distributions.
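As an illustration of the two refinements just mentioned, stated in the usual textbook form rather than as the thesis's theorems: the simple continuity correction shifts the argument of the global approximation by 1/2, and the relative local approximation controls a ratio rather than a difference,

\[
  \mathbb{P}(S_n \le k) \approx \Phi\!\left(\frac{k + 1/2 - n/2}{\sqrt{n}/2}\right),
  \qquad
  \left|\frac{\mathbb{P}(S_n = k)}{(2/\sqrt{n})\,\varphi(x_k)} - 1\right|,
\]

with x_k as above; the optimal constants and asymptotically sharp bounds concern the size of such errors as n \to \infty.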
Additionally, we provide asymptotically sharp bounds on the approximation error in the local
central limit theorem for convolution powers of continuous and discrete uniform distributions.
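For the continuous case, the statement being quantified can be sketched as follows (f_n below denotes the density of the sum of n independent uniform random variables on [0, 1]; the discrete uniform case is analogous with the corresponding lattice mean and variance):

\[
  f_n(x) \approx \frac{1}{\sqrt{n/12}}\,\varphi\!\left(\frac{x - n/2}{\sqrt{n/12}}\right),
\]

since the uniform law on [0, 1] has mean 1/2 and variance 1/12.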
Our methods include Fourier inversion formulae, explicit inequalities, and Edgeworth expansions, some of which may be of independent interest.
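For integer-valued sums such as the symmetric binomial, one standard instance of such a Fourier inversion formula, stated here for orientation, is

\[
  \mathbb{P}(S_n = k) = \frac{1}{2\pi}\int_{-\pi}^{\pi} e^{-\mathrm{i}tk}\,\bigl(\mathbb{E}\,e^{\mathrm{i}tX_1}\bigr)^{n}\,\mathrm{d}t,
  \qquad
  \mathbb{E}\,e^{\mathrm{i}tX_1} = \frac{1 + e^{\mathrm{i}t}}{2} = e^{\mathrm{i}t/2}\cos\frac{t}{2}
  \quad\text{for } X_1 \sim \mathrm{Bin}(1, \tfrac12),
\]

so that S_n = X_1 + \dots + X_n is Bin(n, 1/2); comparing the factor \cos^n(t/2) with the Gaussian characteristic function e^{-nt^2/8} is the usual route from such inversion formulae to normal and Edgeworth approximations with explicit error control.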