Concentration inequalities for the sample mean, such as those due to Bernstein and Hoeffding, are valid for any sample size but overly conservative, yielding confidence intervals that are unnecessarily wide. In this talk, motivated by applications to reinforcement learning, we develop new results on transport and information-theoretic distances. These results yield new, computable concentration inequalities with asymptotically optimal size, finite-sample validity, and sub-Gaussian decay, enabling the construction of efficient confidence intervals with correct coverage for any sample size. We derive our inequalities by tightly bounding the Hellinger distance, Stein discrepancy, non-uniform Kolmogorov distance, and Wasserstein distance to a Gaussian.
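As a concrete illustration of the conservativeness mentioned above (this sketch is not from the talk; it only contrasts a classical finite-sample bound with the asymptotic one), the following compares the Hoeffding confidence-interval half-width for [0,1]-bounded variables with the CLT half-width at the worst-case standard deviation sigma = 1/2. The sample size n = 1000 and level delta = 0.05 are arbitrary choices for the example.

```python
import math
from statistics import NormalDist

def hoeffding_halfwidth(n, delta, a=0.0, b=1.0):
    # Hoeffding: P(|mean - mu| >= t) <= 2 exp(-2 n t^2 / (b - a)^2),
    # so the (1 - delta) half-width is (b - a) * sqrt(log(2/delta) / (2n)).
    return (b - a) * math.sqrt(math.log(2 / delta) / (2 * n))

def clt_halfwidth(n, delta, sigma):
    # Asymptotic (CLT) half-width: z_{1 - delta/2} * sigma / sqrt(n).
    z = NormalDist().inv_cdf(1 - delta / 2)
    return z * sigma / math.sqrt(n)

n, delta = 1000, 0.05
sigma = 0.5  # worst-case standard deviation for [0,1]-bounded variables
print(hoeffding_halfwidth(n, delta))      # ~0.043, valid for every n
print(clt_halfwidth(n, delta, sigma))     # ~0.031, correct only asymptotically
```

Even at the worst-case variance, the finite-sample Hoeffding interval is noticeably wider than the asymptotic one; the gap is what the talk's inequalities aim to close while retaining validity at every sample size.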