We consider likelihood-based asymptotic inference for a $p$-dimensional parameter $\theta$ of an identifiable parametric model with an information matrix that is singular, of rank $p-1$, at $\theta=\theta^*$, and with a likelihood differentiable up to a specific order. We derive the asymptotic distribution of the likelihood ratio test statistic for the simple null hypothesis that $\theta=\theta^*$ and of the maximum likelihood estimator (MLE) of $\theta$ when $\theta=\theta^*$. We show that there exists a reparametrization such that the MLE of the last $p-1$ components of $\theta$ converges at rate $O_p(n^{-1/2})$. For the first component $\theta_1$ of $\theta$, the rate of convergence depends on the order $s$ of the first non-zero partial derivative of the log-likelihood with respect to $\theta_1$ evaluated at $\theta^*$. When $s$ is odd, the rate of convergence of the MLE of $\theta_1$ is $O_p(n^{-1/(2s)})$. When $s$ is even, the rate of convergence of the MLE of $|\theta_1-\theta_1^*|$ is $O_p(n^{-1/(2s)})$ and, moreover, the asymptotic distribution of the sign of the MLE of $\theta_1-\theta_1^*$ is non-standard. When $p=1$, it is determined by the sign of the sum of the residuals from the population least-squares regression of the $(s+1)$th derivative of the individual contributions to the log-likelihood on their derivatives of order $s$. When $p>1$, it is determined by a linear combination of the sum of residuals of a multivariate population least-squares regression involving partial and mixed derivatives of the log-likelihood of a specific order. Thus, although the MLE of $|\theta_1-\theta_1^*|$ has a uniform rate of convergence of $O_p(n^{-1/(2s)})$, the uniform convergence rate for the MLE of $\theta_1$ in suitable shrinking neighbourhoods of $\theta_1^*$ is only $O_p(n^{-1/(2s+2)})$.
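To see heuristically where the $O_p(n^{-1/(2s)})$ rate comes from, consider the case $p=1$ and the following balance argument (a sketch under the smoothness assumptions above, not the paper's proof; it also assumes, as in this setting, that the $s$th derivative of the individual log-likelihood contributions has mean zero at $\theta^*$). Expanding the log-likelihood $\ell_n$ around $\theta^*$, the first $s-1$ derivative terms vanish by assumption, giving

$$\ell_n(\theta)-\ell_n(\theta^*) \approx \frac{(\theta-\theta^*)^s}{s!}\sum_{i=1}^n \frac{\partial^s \log f(X_i;\theta^*)}{\partial\theta^s} \;-\; c\,n\,(\theta-\theta^*)^{2s},$$

where the mean-zero sum is $O_p(n^{1/2})$ and $c>0$ because the Kullback-Leibler divergence between $\theta$ and $\theta^*$ grows like $(\theta-\theta^*)^{2s}$ when the information is singular of this order. The stochastic and deterministic terms balance when $n^{1/2}|\theta-\theta^*|^s \asymp n\,|\theta-\theta^*|^{2s}$, that is, when $|\theta-\theta^*| \asymp n^{-1/(2s)}$; for $s=1$ this recovers the usual $n^{-1/2}$ rate.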
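The slow rate is also easy to see in simulation. Below is a minimal sketch in Python using a toy model that is not taken from the abstract: $X_i \sim N(\theta^3, 1)$ with $\theta^*=0$. This model is identifiable, its score and second derivative in $\theta$ vanish at $\theta^*=0$, and the first non-zero derivative of the log-likelihood has order $s=3$ (odd), so the predicted rate for the MLE of $\theta$ is $n^{-1/(2s)}=n^{-1/6}$.

```python
import numpy as np

# Toy model (illustrative assumption, not from the paper): X_i ~ N(theta^3, 1),
# theta* = 0. The log-likelihood -0.5 * sum((x_i - theta^3)^2) is maximized at
# theta^3 = xbar, so the MLE has a closed form.
rng = np.random.default_rng(0)

def mle_theta(xbar):
    # Real cube root of the sample mean: hat(theta) = sign(xbar) * |xbar|^(1/3).
    return np.sign(xbar) * np.abs(xbar) ** (1.0 / 3.0)

reps = 5000
for n in [10**2, 10**4, 10**6]:
    # The MLE depends on the data only through the sample mean,
    # so we draw xbar ~ N(0, 1/n) directly instead of full samples.
    xbar = rng.standard_normal(reps) / np.sqrt(n)
    est = mle_theta(xbar)
    scaled = n ** (1.0 / 6.0) * est  # n^{1/6} * hat(theta) should stabilize in law
    print(f"n={n:>8}  sd(hat_theta)={est.std():.4f}  "
          f"sd(n^(1/6)*hat_theta)={scaled.std():.4f}")
```

Running this, the standard deviation of $\hat\theta$ shrinks with $n$ much more slowly than $n^{-1/2}$, while that of $n^{1/6}\hat\theta$ stays roughly constant across $n$, consistent with the $n^{-1/6}$ rate for $s=3$.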