We propose a new training algorithm for feedforward supervised neural networks based on a primal-dual interior-point method for nonlinear programming. Specifically, we consider a one-hidden-layer network architecture in which the error function is defined by the L2 norm and the activation functions of the hidden and output neurons are nonlinear. Computational results are given for odd-parity problems with 2, 3, and 5 inputs. Approximation of a nonlinear dynamical system is also discussed.
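
For orientation only, the sketch below sets up the ingredients named in the abstract: a one-hidden-layer network with nonlinear (here sigmoid) activations, an L2 (sum-of-squares) error, and odd-parity training data. It is not the paper's algorithm; the primal-dual interior-point training is the paper's contribution and its details are not given here, so the sketch uses plain gradient descent as a stand-in optimizer. The hidden-layer size, learning rate, and function names are illustrative assumptions.

```python
# Minimal sketch, assuming sigmoid activations and gradient-descent training
# as a placeholder; the paper's primal-dual interior-point method is not shown.
import itertools
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def odd_parity(n_inputs):
    """All binary input patterns, target 1 when the number of ones is odd."""
    X = np.array(list(itertools.product([0.0, 1.0], repeat=n_inputs)))
    y = (X.sum(axis=1) % 2).reshape(-1, 1)
    return X, y

def forward(params, X):
    W1, b1, W2, b2 = params
    h = sigmoid(X @ W1 + b1)   # hidden layer, nonlinear activation
    o = sigmoid(h @ W2 + b2)   # output layer, nonlinear activation
    return h, o

def l2_error(params, X, y):
    _, o = forward(params, X)
    return 0.5 * np.sum((o - y) ** 2)  # L2 (sum-of-squares) error function

def train(n_inputs=3, n_hidden=4, lr=0.5, steps=20000, seed=0):
    """Placeholder training loop (gradient descent, not the interior-point method)."""
    rng = np.random.default_rng(seed)
    X, y = odd_parity(n_inputs)
    W1 = rng.normal(scale=0.5, size=(n_inputs, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    b2 = np.zeros(1)
    for _ in range(steps):
        h, o = forward((W1, b1, W2, b2), X)
        d_o = (o - y) * o * (1 - o)           # error signal at output layer
        d_h = (d_o @ W2.T) * h * (1 - h)      # backpropagated to hidden layer
        W2 -= lr * h.T @ d_o; b2 -= lr * d_o.sum(axis=0)
        W1 -= lr * X.T @ d_h; b1 -= lr * d_h.sum(axis=0)
    return (W1, b1, W2, b2), l2_error((W1, b1, W2, b2), X, y)

if __name__ == "__main__":
    # Example: the 3-input odd-parity problem mentioned in the abstract.
    params, final_error = train(n_inputs=3)
    print("final L2 error:", final_error)
```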