Random projection RBF nets for multidimensional density estimation
Ewa Skubalska-Rafajłowicz
International Journal of Applied Mathematics and Computer Science, Volume 18 (2008), pp. 455-464 / Harvested from The Polish Digital Mathematics Library

The dimensionality and the amount of data that need to be processed when intensive data streams are observed grow rapidly with the development of sensor arrays, CCD and CMOS cameras, and other such devices. The aim of this paper is to propose an approach to dimensionality reduction as a first stage of training RBF nets. As a vehicle for presenting the ideas, the problem of estimating multivariate probability densities is chosen. The linear projection method is briefly surveyed. Using random projections as the first (additional) layer, we are able to reduce the dimensionality of the input data. Bounds on the accuracy of RBF nets equipped with a random projection layer, in comparison to RBF nets without dimensionality reduction, are established. Finally, the results of simulations concerning multidimensional density estimation are briefly reported.
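
As a concrete illustration of the pipeline described above, the following minimal NumPy sketch combines the two stages: a random linear projection of the raw observations into a low-dimensional space, followed by a Gaussian-kernel (Parzen-style) RBF estimate computed on the projected data. The Gaussian projection matrix, the single bandwidth h, and the use of the projected training points as kernel centers are illustrative assumptions, not the exact construction analyzed in the paper.

# Sketch: random projection layer in front of an RBF (Parzen-style) density
# estimator. Assumptions not fixed by the abstract: Gaussian projection matrix,
# Gaussian kernels with one bandwidth h, centers at the projected training points.
import numpy as np

rng = np.random.default_rng(0)

def random_projection(d, k, rng):
    """k x d Gaussian projection matrix, scaled so distances are roughly preserved."""
    return rng.standard_normal((k, d)) / np.sqrt(k)

def rbf_density(train, query, S, h):
    """Estimate the density of the projected data at the projected query points."""
    Z = train @ S.T                      # project the training sample to R^k
    Q = query @ S.T                      # project the query points the same way
    k = S.shape[0]
    norm = (2.0 * np.pi * h**2) ** (k / 2)
    # Gaussian RBF kernel evaluated between every query point and every center
    sq_dist = ((Q[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dist / (2.0 * h**2)).mean(axis=1) / norm

# Toy usage: 200-dimensional data reduced to k = 5 dimensions before estimation.
d, k, n = 200, 5, 1000
X = rng.standard_normal((n, d))          # training sample
S = random_projection(d, k, rng)
x_new = rng.standard_normal((10, d))     # points at which to evaluate the estimate
print(rbf_density(X, x_new, S, h=0.5))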

Publié le : 2008-01-01
EUDML-ID : urn:eudml:doc:207899
@article{bwmeta1.element.bwnjournal-article-amcv18i4p455bwm,
     author = {Ewa Skubalska-Rafaj\l owicz},
     title = {Random projection RBF nets for multidimensional density estimation},
     journal = {International Journal of Applied Mathematics and Computer Science},
     volume = {18},
     year = {2008},
     pages = {455-464},
     zbl = {1155.93428},
     language = {en},
     url = {http://dml.mathdoc.fr/item/bwmeta1.element.bwnjournal-article-amcv18i4p455bwm}
}
Ewa Skubalska-Rafajłowicz. Random projection RBF nets for multidimensional density estimation. International Journal of Applied Mathematics and Computer Science, Volume 18 (2008), pp. 455-464. http://gdmltest.u-ga.fr/item/bwmeta1.element.bwnjournal-article-amcv18i4p455bwm/
