We review the properties and applications of M-estimators with a redescending score function. In regression analysis, some of these redescending M-estimators attain the maximum breakdown point that is possible in this setup. Moreover, some of them solve the problem of maximizing efficiency subject to a bounded influence function when the regression coefficients and the scale parameter are estimated simultaneously. Hence redescending M-estimators satisfy several outlier-robustness properties. However, computing redescending M-estimators in regression is problematic: while in the location-scale case the Cauchy estimator, for example, has only one local extremum, in regression the objective function typically has several local minima, each reflecting a substructure in the data. For this reason redescending M-estimators can be used to detect substructures, i.e. they can serve as tools for cluster analysis: if the starting point of the iteration used to compute the estimator comes from a given substructure, then the nearest local minimum corresponds to that substructure. The same property can be used to construct edge- and corner-preserving smoothers for noisy images, so there are applications in image analysis as well.
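As an illustration of the multiple local minima described above, the following minimal Python sketch (not code from the paper; the data, function name and tuning constant are illustrative assumptions) fits a no-intercept regression line by iteratively reweighted least squares with the redescending Cauchy score. The data contain two linear substructures, and the iteration converges to a different local minimum depending on the starting slope, which is the behaviour exploited for cluster analysis.

import numpy as np

rng = np.random.default_rng(0)

# Two linear substructures: y = 2x and y = -2x, plus small noise.
x = rng.uniform(-1.0, 1.0, 100)
y = np.where(np.arange(100) < 50, 2.0 * x, -2.0 * x) + 0.1 * rng.standard_normal(100)

def cauchy_irls(beta0, sigma=0.3, n_iter=200):
    """IRLS for y ~ beta * x with the Cauchy objective rho(r) = log(1 + (r/sigma)^2)."""
    beta = beta0
    for _ in range(n_iter):
        r = y - beta * x
        # Redescending behaviour: the weights tend to 0 for large residuals,
        # so observations from the "other" substructure are effectively ignored.
        w = 1.0 / (1.0 + (r / sigma) ** 2)
        beta = np.sum(w * x * y) / np.sum(w * x * x)
    return beta

# Starting values taken near each substructure lead to different local minima.
print(cauchy_irls(beta0=2.5))    # close to  2: minimum belonging to the first substructure
print(cauchy_irls(beta0=-2.5))   # close to -2: minimum belonging to the second substructure

In this sketch a larger tuning constant sigma flattens the weights, so the two minima can merge; the number of detectable substructures thus depends on the tuning of the score function.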
@article{bwmeta1.element.bwnjournal-article-doi-10_7151_dmps_1046,
  author   = {Christine H. M\"uller},
  title    = {Redescending M-estimators in regression analysis, cluster analysis and image analysis},
  journal  = {Discussiones Mathematicae Probability and Statistics},
  volume   = {24},
  year     = {2004},
  pages    = {59-75},
  zbl      = {1053.62081},
  language = {en},
  url      = {http://dml.mathdoc.fr/item/bwmeta1.element.bwnjournal-article-doi-10_7151_dmps_1046}
}
Christine H. Müller. Redescending M-estimators in regression analysis, cluster analysis and image analysis. Discussiones Mathematicae Probability and Statistics, Volume 24 (2004), pp. 59-75. http://gdmltest.u-ga.fr/item/bwmeta1.element.bwnjournal-article-doi-10_7151_dmps_1046/