We present a general approach to statistical problems with criteria based on probabilities of large deviations. Our main idea, which originates in the similarity between the definitions of the large-deviation principle (LDP) and weak convergence, is to develop a large-deviation analogue of asymptotic decision theory. We introduce the concept of the LDP for sequences of statistical experiments, which parallels the concept of weak convergence of experiments, and prove that, in analogy with Le Cam's minimax theorem, the LDP provides an asymptotic lower bound for the sequence of appropriately defined minimax risks. We also show that the bound is tight and give a method of constructing decisions whose asymptotic risk is arbitrarily close to the bound. The construction is further specified for hypothesis testing and estimation problems.
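For orientation, the parallel invoked above can be seen in the classical textbook formulations of the two notions (stated here in their standard form, not in the paper's experiment-level notation, which may differ in detail): a sequence of probability measures $(P_n)$ on a metric space satisfies the LDP with speed $r_n \to \infty$ and lower semicontinuous rate function $I$ if, for every Borel set $A$,
\[
-\inf_{x \in A^{\circ}} I(x) \;\le\; \liminf_{n} \frac{1}{r_n} \log P_n(A) \;\le\; \limsup_{n} \frac{1}{r_n} \log P_n(A) \;\le\; -\inf_{x \in \bar{A}} I(x),
\]
while, by the portmanteau theorem, $P_n$ converges weakly to $P$ if and only if, for every Borel set $A$,
\[
P(A^{\circ}) \;\le\; \liminf_{n} P_n(A) \;\le\; \limsup_{n} P_n(A) \;\le\; P(\bar{A}).
\]
The two statements share the same inner-open/outer-closed structure, with the set function $A \mapsto -\inf_{x \in A} I(x)$ playing, on the logarithmic scale, the role of the limit measure.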