In the regression model with errors in variables, we observe $n$ i.i.d. copies of $(Y, Z)$ satisfying $Y = f_{\theta^0}(X) + \xi$ and $Z = X + \varepsilon$, involving independent and unobserved random variables $X$, $\xi$, $\varepsilon$, and a regression function $f_{\theta^0}$ known up to a finite-dimensional parameter $\theta^0$. The common densities of the $X_i$'s and of the $\xi_i$'s are unknown, whereas the distribution of $\varepsilon$ is completely known. We aim at estimating the parameter $\theta^0$ from the observations $(Y_1, Z_1), \ldots, (Y_n, Z_n)$. We propose an estimation procedure based on the least squares criterion $\tilde{S}_{\theta^{0},g}(\theta)=\mathbb{E}_{\theta^{0},g}[(Y-f_{\theta}(X))^{2}w(X)]$, where $w$ is a weight function to be chosen. We construct an estimator and derive an upper bound for its risk that depends on the smoothness of the errors density $p_\varepsilon$ and on the smoothness properties of $w(x)f_\theta(x)$. Furthermore, we give sufficient conditions under which the parametric rate of convergence is achieved. We provide practical recipes for the choice of $w$ in the case of nonlinear regression functions that are piecewise smooth, which improves the order of the rate of convergence, up to the parametric rate in some cases. We also consider extensions of the estimation procedure, in particular when a choice of $w_\theta$ depending on $\theta$ would be more appropriate.
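To make the model and the weighted least squares criterion concrete, here is a minimal Monte Carlo sketch (not the paper's estimator). It simulates the errors-in-variables model with a toy linear choice $f_\theta(x) = \theta x$ and a Gaussian weight $w$, both assumptions for illustration only, and evaluates the empirical version of $\mathbb{E}[(Y - f_\theta(X))^2 w(X)]$ on a grid of $\theta$ values. Note that the latent $X$ is used directly here purely to show that the criterion is minimized near $\theta^0$; the procedure studied in the paper works from the observations $(Y_i, Z_i)$ alone.

```python
import numpy as np

# Simulate the model Y = f_{theta0}(X) + xi, Z = X + eps.
rng = np.random.default_rng(0)
n = 100_000
theta0 = 2.0

X = rng.normal(0.0, 1.0, n)     # unobserved covariate (density unknown in the paper)
xi = rng.normal(0.0, 0.5, n)    # regression noise (density unknown in the paper)
eps = rng.normal(0.0, 0.3, n)   # measurement error (its law is completely known)
Y = theta0 * X + xi             # toy regression function f_theta(x) = theta * x
Z = X + eps                     # observed noisy covariate (unused in this sketch)

def w(x):
    # weight function to be chosen; a Gaussian weight is one smooth, integrable option
    return np.exp(-x**2 / 2)

def S(theta):
    # empirical counterpart of the criterion E[(Y - f_theta(X))^2 w(X)];
    # uses the latent X, which is unobserved in the actual estimation problem
    return np.mean((Y - theta * X) ** 2 * w(X))

thetas = np.linspace(0.0, 4.0, 401)
theta_hat = thetas[np.argmin([S(t) for t in thetas])]
print(theta_hat)  # minimizer lands near theta0 = 2.0
```

The point of the sketch is only that $\theta^0$ minimizes the population criterion; the paper's contribution is to build a feasible analogue of $\tilde{S}_{\theta^{0},g}$ from $(Y, Z)$ using the known distribution of $\varepsilon$.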