Consider a regression model $Y_i = x_i'\beta + R_i$, $i = 1, \dots, n$, where $\{R_i\}$ are i.i.d. with c.d.f. $F$; $x_i \in R^p$ and $\beta \in R^p$. Let $\hat{\beta}$ be an M-estimator of $\beta$ defined using the kernel $\psi$; let $\hat{F}_n(x)$ denote the empiric distribution of the residuals, $Y_i - x_i'\hat{\beta}$, and let $\hat{F}_n^\ast$ be the empiric c.d.f. of the errors, $\{R_i\}$. Under suitable smoothness conditions on $\psi$, $F$, and the density $F' = f$, and conditions requiring essentially that $\{x_i\}$ behave like a random sample from some distribution in $R^p$, it is shown that, for fixed $x$, $\sqrt{n}\,(\hat{F}_n(x) - \hat{F}_n^\ast(x) - H_n(x)) - \frac{p}{\sqrt{n}}\, g(x) \rightarrow_p 0$, where $g(x) = a f(x)\psi(x) + b f'(x)$ and $H_n(x) = (1/(nd)) f(x) \sum_{i=1}^n \psi(R_i)$ if the design has a constant term [and $H_n(x)$ vanishes otherwise]. A tightness result shows that if $p/\sqrt{n} \rightarrow c$, then $\sqrt{n}\,(\hat{F}_n(x) - F(x))$ converges weakly to a Gaussian process with drift given by the bias term $c\, g(x)$ and covariance function strongly affected by $H_n(x)$ and different from that of the usual Brownian bridge. In the course of the proof, an expansion for the fitted values, $x_i'\hat{\beta}$, is obtained, with error $O_p(p^{11/4} \ln^2 n / n^2) = o_p(1/\sqrt{n})$ if $p^2/n$ is bounded.
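To make the objects above concrete, the following minimal numerical sketch (not from the paper) fits an M-estimator by iteratively reweighted least squares on simulated data with a constant term in the design and evaluates $\hat{F}_n(x)$ and $\hat{F}_n^\ast(x)$ at a fixed point. The Huber kernel, its tuning constant, the fixed unit scale, and the $t_3$ error law are all illustrative assumptions and not the paper's setup.

```python
import numpy as np

def huber_psi(r, k=1.345):
    """Huber kernel psi(r): the identity near 0, clipped at +/- k.
    (Illustrative choice; the paper allows a general smooth psi.)"""
    return np.clip(r, -k, k)

def m_estimate(X, y, k=1.345, tol=1e-8, max_iter=200):
    """M-estimator of beta via iteratively reweighted least squares,
    assuming known unit error scale (an assumption of this sketch)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares start
    for _ in range(max_iter):
        r = y - X @ beta
        w = np.ones_like(r)                       # weight psi(r)/r -> 1 as r -> 0
        big = np.abs(r) > 1e-12
        w[big] = huber_psi(r[big], k) / r[big]
        beta_new = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

rng = np.random.default_rng(0)
n, p = 2000, 10
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])  # constant term present
beta = rng.standard_normal(p)
R = rng.standard_t(df=3, size=n)                  # i.i.d. errors R_i with c.d.f. F
y = X @ beta + R

beta_hat = m_estimate(X, y)
resid = y - X @ beta_hat                          # residuals Y_i - x_i' beta_hat

x0 = 0.5
Fn_hat  = np.mean(resid <= x0)                    # \hat F_n(x0): residual empiric c.d.f.
Fn_star = np.mean(R <= x0)                        # \hat F_n^*(x0): error empiric c.d.f.
print("sqrt(n) * (Fn_hat - Fn_star) =", np.sqrt(n) * (Fn_hat - Fn_star))
```

Per the main result, with a constant term this normalized gap is dominated by $\sqrt{n}\, H_n(x) \approx (1/(\sqrt{n}\, d)) f(x) \sum_i \psi(R_i)$, which is $O_p(1)$ rather than vanishing, which is why the limiting covariance differs from that of the usual Brownian bridge.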