Examples are provided of Markovian martingales that (i) converge in distribution but fail to converge in probability, and (ii) converge in probability but fail to converge almost surely. This stands in sharp contrast to the behavior of series with independent increments, and settles, in the negative, a question raised by Loève in 1964. It is then proved that a discrete, real-valued Markov chain with stationary transition probabilities which is also a martingale converges almost surely whenever it converges in distribution, provided the limiting measure has a finite mean. This result does not extend to non-discrete processes.
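In symbols (the notation $X_n$ for the process, $X_\infty$ for its almost-sure limit, and $\mu$ for the limiting distribution is introduced here for convenience and is not part of the original statement), the general hierarchy whose converses the examples (i) and (ii) refute is
\[
  X_n \xrightarrow{\text{a.s.}} X_\infty \;\Longrightarrow\; X_n \xrightarrow{P} X_\infty \;\Longrightarrow\; X_n \xrightarrow{\mathcal{D}} \mu,
\]
and the positive result may be summarized as follows: if $(X_n)$ is a discrete, real-valued Markov chain with stationary transition probabilities that is also a martingale, then
\[
  X_n \xrightarrow{\mathcal{D}} \mu \quad\text{and}\quad \int_{\mathbb{R}} |x|\,\mu(dx) < \infty \;\Longrightarrow\; X_n \longrightarrow X_\infty \ \text{almost surely.}
\]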