The principle of maximum entropy, together with some generalizations, is interpreted as a heuristic principle for the generation of null hypotheses. The main application is to $m$-dimensional population contingency tables, with the marginal totals given down to dimension $m - r$ ("restraints of the $r$th order"). The principle then leads to the null hypothesis of no "$r$th-order interaction." Significance tests are given for testing the hypothesis of no $r$th-order or higher-order interaction within the wider hypothesis of no $s$th-order or higher-order interaction, some cases of which have been treated by Bartlett and by Roy and Kastenbaum. It is shown that, if a complete set of $r$th-order restraints is given, then the hypothesis of the vanishing of all $r$th-order and higher-order interactions leads to a unique set of cell probabilities, provided the restraints are consistent, but not only just consistent. This confirms and generalizes a recent conjecture due to Darroch. A kind of duality between maximum entropy and maximum likelihood is proved. Some relationships between maximum entropy, interactions, and Markov chains are also established.
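
(Illustrative sketch, not part of the original abstract; the symbols $p_{ijk}$, $A_{ij}$, $B_{ik}$, $C_{jk}$ below are introduced only for this example.) For a three-dimensional table in which every two-dimensional marginal total is prescribed, maximizing the entropy $-\sum_{i,j,k} p_{ijk} \log p_{ijk}$ subject to those restraints attaches, via Lagrange multipliers, one factor to each fixed marginal, so the maximum-entropy cell probabilities take the multiplicative form
\[
  p_{ijk} = A_{ij}\, B_{ik}\, C_{jk},
\]
which in the $2 \times 2 \times 2$ case is equivalent to Bartlett's no-interaction condition
\[
  \frac{p_{111}\, p_{122}\, p_{212}\, p_{221}}{p_{112}\, p_{121}\, p_{211}\, p_{222}} = 1 .
\]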