Let $\{X_n, n\ge 0\}$ be a Markov chain on a general state space ${\mathcal{X}}$ with transition probability $P$ and stationary probability $\pi$. Suppose an additive component $S_n$ takes values in the real line $\mathbb{R}$ and is adjoined to the chain such that $\{(X_n,S_n), n\ge 0\}$ is a Markov random walk. In this paper, we prove a uniform Markov renewal theorem with an estimate on the rate of convergence. This result is applied to boundary crossing problems for $\{(X_n,S_n), n\ge 0\}$. To be more precise, for given $b\ge 0$, define the stopping time $\tau=\tau(b)=\inf\{n: S_n>b\}$. When the drift $\mu$ of the random walk $S_n$ is $0$, we derive a one-term Edgeworth type asymptotic expansion for the first passage probabilities $P_\pi\{\tau<m\}$ and $P_\pi\{\tau<m,\, S_m<c\}$, where $P_\pi$ denotes the probability under the initial distribution $\pi$. When $\mu\ne 0$, Brownian approximations for the first passage probabilities with correction terms are derived. Applications to sequential estimation and truncated tests in random coefficient models, and to first passage times in products of random matrices, are also given.
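The objects above can be made concrete with a small simulation. The sketch below is purely illustrative and not from the paper: it uses a hypothetical two-state chain $X_n$ with symmetric stationary distribution $\pi=(1/2,1/2)$ and state-dependent increments $\pm 1$ (so the drift $\mu$ is $0$ under $\pi$), and estimates the first passage probability $P_\pi\{\tau<m\}$ by Monte Carlo. All names and parameters are assumptions chosen for the example.

```python
import random

def simulate_first_passage(b, m, n_paths=20000, seed=0):
    """Monte Carlo estimate of P_pi{tau < m} for a toy Markov random walk.

    Toy model (illustrative assumption, not the paper's setting):
    a two-state chain that keeps its state with probability 0.7,
    with increment +1 in state 0 and -1 in state 1, so the stationary
    distribution is pi = (1/2, 1/2) and the drift mu is 0.
    """
    rng = random.Random(seed)
    increments = {0: 1.0, 1: -1.0}       # zero drift under pi
    hits = 0
    for _ in range(n_paths):
        x = rng.choice([0, 1])           # draw X_0 from pi
        s = 0.0                          # S_0 = 0
        for n in range(1, m):            # check tau = inf{n : S_n > b} < m
            if rng.random() < 0.3:       # flip state with probability 0.3
                x = 1 - x
            s += increments[x]
            if s > b:
                hits += 1
                break
    return hits / n_paths
```

For example, `simulate_first_passage(b=0.0, m=50)` should give an estimate well above that of `simulate_first_passage(b=10.0, m=50)`, reflecting that $P_\pi\{\tau<m\}$ decreases in the boundary $b$; the persistence parameter 0.7 is what makes the increments Markov-dependent rather than i.i.d.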