Second-Order Asymptotically Optimal Statistical Classification
Author: Lin Zhou
Language: English
Conference: 2019 IEEE International Symposium on Information Theory (ISIT)
Date of Publication: July 7, 2019
Abstract:
Motivated by real-world machine learning applications, we analyze approximations to the non-asymptotic fundamental limits of statistical classification. In the binary version of this problem, given two training sequences generated according to two unknown distributions P_1 and P_2, one is tasked with classifying a test sequence that is known to be generated according to either P_1 or P_2. This problem can be thought of as an analogue of the binary hypothesis testing problem, except that in the present setting the generating distributions are unknown. Due to finite-sample considerations, we consider the second-order asymptotic (or dispersion-type) tradeoff between the type-I and type-II error probabilities for tests which ensure that (i) the type-I error probability decays exponentially fast for all pairs of distributions and (ii) the type-II error probability is non-vanishing for a particular pair of distributions. We generalize our results to classification of multiple hypotheses with the rejection option.
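To make the setting concrete, below is a minimal illustrative sketch of one classical approach to this classification task: a type-based test in the spirit of Gutman's classifier, which compares the empirical distribution of the test sequence against the training sequences via a generalized Jensen-Shannon divergence and thresholds the score. The choice of statistic, the function names, and the threshold lambda_ are illustrative assumptions, not necessarily the test analyzed in the paper.

# Illustrative sketch (not from the paper): a Gutman-style type-based binary
# classifier over a finite alphabet. The threshold lambda_ trades off the
# type-I and type-II error probabilities.
from collections import Counter
import math

def empirical_pmf(seq, alphabet):
    # Empirical distribution (type) of seq over the given alphabet.
    counts = Counter(seq)
    n = len(seq)
    return {a: counts.get(a, 0) / n for a in alphabet}

def gjs_divergence(p_test, p_train, alpha):
    # Generalized Jensen-Shannon divergence between the test type and a training
    # type, with alpha = (test length) / (training length).
    def kl(p, q):
        return sum(p[a] * math.log(p[a] / q[a]) for a in p if p[a] > 0)
    mix = {a: (alpha * p_test[a] + p_train[a]) / (1 + alpha) for a in p_test}
    return alpha * kl(p_test, mix) + kl(p_train, mix)

def classify(test_seq, train1, train2, lambda_=0.1):
    # Decide which unknown source (1 or 2) generated test_seq.
    alphabet = set(test_seq) | set(train1) | set(train2)
    q = empirical_pmf(test_seq, alphabet)
    p1 = empirical_pmf(train1, alphabet)
    p2 = empirical_pmf(train2, alphabet)
    alpha = len(test_seq) / len(train1)  # assumes equal-length training sequences
    # Decide source 1 if the test type is sufficiently far from the second
    # training type; otherwise decide source 2.
    return 1 if gjs_divergence(q, p2, alpha) > lambda_ else 2

# Example: classify("aaab", "aaaa", "bbbb") returns 1, since the test sequence
# resembles the first training sequence far more than the second.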