The classical theory of rank-based inference is essentially limited to univariate linear models with independent observations. The objective of this paper is to illustrate some recent extensions of this theory to time-series problems (serially dependent observations) in a multivariate setting (multivariate observations) under very mild distributional assumptions (mainly, elliptical symmetry; for some of the testing problems treated below, even second-order moments are not required). After a brief presentation of the invariance principles that underlie the concepts of ranks to be considered, we concentrate on two examples of practical relevance: (1) the multivariate Durbin–Watson problem (testing against autocorrelated noise in a linear model context) and (2) the problem of identifying the order of a vector autoregressive (VAR) model, that is, testing VAR(p0) against VAR(p0 + 1) dependence. These two testing procedures are the building blocks of classical autoregressive order-identification methods. Based either on pseudo-Mahalanobis (Tyler) or on hyperplane-based (Oja and Paindaveine) signs and ranks, three classes of test statistics are considered for each problem: (1) statistics of the sign-test type, (2) Spearman statistics, and (3) van der Waerden (normal-score) statistics. Simulations confirm theoretical results about the power of the proposed rank-based methods and establish their good robustness properties.
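
To make the pseudo-Mahalanobis (Tyler) sign construction concrete, the following is a minimal illustrative sketch in Python of a sign-test-type statistic for serial dependence at lag 1. It is not the paper's exact procedure: the function names, the fixed-point iteration details, and the chi-square(k^2) calibration of the statistic are assumptions made for illustration, covering only the Tyler-sign building block mentioned above.

```python
# Illustrative sketch only: a Tyler-sign-based lag-1 serial-dependence
# test. Names and the chi-square(k^2) calibration are assumptions, not
# necessarily the paper's exact statistic.
import numpy as np
from scipy import stats

def tyler_shape(X, n_iter=100, tol=1e-8):
    """Tyler's M-estimator of shape via fixed-point iteration,
    normalized so that trace(V) = k (the dimension)."""
    n, k = X.shape
    V = np.eye(k)
    for _ in range(n_iter):
        Vinv = np.linalg.inv(V)
        # squared pseudo-Mahalanobis norms x_t' V^{-1} x_t
        d2 = np.einsum('ti,ij,tj->t', X, Vinv, X)
        V_new = (k / n) * (X / d2[:, None]).T @ X
        V_new *= k / np.trace(V_new)
        if np.linalg.norm(V_new - V, ord='fro') < tol:
            return V_new
        V = V_new
    return V

def tyler_signs(X):
    """Pseudo-Mahalanobis (Tyler) signs: unit-norm directions of the
    standardized observations V^{-1/2} x_t."""
    V = tyler_shape(X)
    # symmetric inverse square root of V
    w, E = np.linalg.eigh(V)
    V_inv_sqrt = E @ np.diag(w ** -0.5) @ E.T
    Z = X @ V_inv_sqrt
    return Z / np.linalg.norm(Z, axis=1, keepdims=True)

def sign_test_lag1(X):
    """Sign-test-type statistic: squared Frobenius norm of the lag-1
    cross-covariance matrix of Tyler signs, referred to a
    chi-square(k^2) null (illustrative calibration)."""
    U = tyler_signs(X)
    n, k = U.shape
    Gamma1 = U[1:].T @ U[:-1] / (n - 1)   # lag-1 sign cross-covariance
    Q = n * k**2 * np.sum(Gamma1**2)
    pval = stats.chi2.sf(Q, df=k**2)
    return Q, pval

rng = np.random.default_rng(0)
X = rng.standard_t(df=3, size=(500, 2))   # heavy-tailed white noise
Q, p = sign_test_lag1(X)
print(f"Q = {Q:.2f}, p-value = {p:.3f}")
```

Because the statistic depends on the observations only through their Tyler signs, it requires no moment assumptions on the underlying elliptical distribution, which is why heavy-tailed noise such as the t(3) sample above poses no difficulty.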