Akaike’s information criterion (AIC) is a measure of the quality of a statistical model for a given set of data. The best statistical model for a particular data set can be determined by minimizing the AIC. Since it is difficult in practice to find the best statistical model among all candidates by this minimization, stepwise methods, which are local search algorithms, are commonly used to find a better statistical model, though it may not be the best one. We formulate this AIC minimization as a mixed-integer nonlinear programming (MINLP) problem and propose a method to find the best statistical model. In particular, we propose ways to compute lower and upper bounds and a branching rule for this minimization, and combine them with SCIP, a mathematical optimization solver that provides a branch-and-bound framework. We show that the proposed method can provide the best statistical model based on the AIC for small- and medium-sized benchmark data sets from the UCI Machine Learning Repository. Furthermore, we show that this method can find good-quality solutions for large benchmark data sets.
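To make the underlying minimization concrete, the following is a minimal sketch (not the paper's method) of AIC-based subset selection for ordinary least-squares regression, using the Gaussian-error form AIC = n·log(RSS/n) + 2k up to an additive constant. It enumerates all subsets exhaustively, which is exactly what becomes intractable as the number of candidate variables grows; the function names and the synthetic data are illustrative assumptions, not taken from the paper.

```python
import itertools
import numpy as np

def aic_linear(X, y, subset):
    """AIC (up to an additive constant) of an OLS fit on the given
    column subset, using AIC = n*log(RSS/n) + 2*k for Gaussian errors."""
    n = len(y)
    Xs = X[:, subset]
    beta, _, _, _ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ beta) ** 2)
    k = len(subset) + 1  # regression coefficients plus the error variance
    return n * np.log(rss / n) + 2 * k

def best_subset_by_aic(X, y):
    """Exhaustively minimize AIC over all nonempty column subsets.
    Feasible only for few candidate variables; a branch-and-bound
    approach aims at the same minimum without full enumeration."""
    p = X.shape[1]
    best = None
    for r in range(1, p + 1):
        for subset in itertools.combinations(range(p), r):
            score = aic_linear(X, y, list(subset))
            if best is None or score < best[0]:
                best = (score, subset)
    return best

# Tiny synthetic example: y depends strongly on columns 0 and 2 only.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.normal(size=100)
score, subset = best_subset_by_aic(X, y)
print(subset)
```

A stepwise method would instead add or remove one variable at a time, accepting only moves that lower the AIC, and can therefore stop at a local minimum that exhaustive search (or an exact branch-and-bound method) would avoid.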