G. Edelman, O. Sporns et G. Tononi ont introduit la complexité neuronale d'une famille de variables aléatoires, définie comme une certaine moyenne de l'information mutuelle de ses sous-familles. On montre ici que leur choix des poids satisfait deux propriétés naturelles : l'invariance par permutations et l'additivité. Nous appelons toute fonctionnelle satisfaisant ces deux propriétés une intrication. Nous classifions toutes les intrications en termes de mesures de probabilité sur l'intervalle unité et nous étudions le taux de croissance du maximum de l'intrication quand la taille du système tend vers l'infini. Pour un système de taille fixée, nous montrons que les maximiseurs ont un petit support et que les systèmes échangeables ont une petite intrication. En particulier, maximiser l'intrication mène à une rupture spontanée de symétrie et il n'y a pas d'unicité.
G. Edelman, O. Sporns and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely invariance under permutations and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and lack of uniqueness.
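The "specific average of mutual information over subfamilies" mentioned in the abstract can be illustrated numerically. The sketch below is not code from the paper: it assumes the functional has the form of a sum of MI(X_S; X_{S^c}) over subsets S, weighted by 1/((n+1)·C(n,|S|)) as in the Edelman–Sporns–Tononi choice the abstract refers to; the function names and toy distributions are purely illustrative.

```python
import itertools
import math
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array, ignoring zero entries."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def marginal(joint, subset):
    """Marginal of `joint` (an array of shape (2,)*n) over the variables in `subset`."""
    n = joint.ndim
    axes = tuple(i for i in range(n) if i not in subset)
    return joint.sum(axis=axes)

def neural_complexity(joint):
    """Weighted average of MI(X_S; X_{S^c}) with weights 1/((n+1)*C(n,k))."""
    n = joint.ndim
    h_total = entropy(joint.ravel())
    total = 0.0
    for k in range(1, n):  # empty and full subsets contribute zero MI
        for subset in itertools.combinations(range(n), k):
            comp = tuple(i for i in range(n) if i not in subset)
            mi = (entropy(marginal(joint, subset).ravel())
                  + entropy(marginal(joint, comp).ravel())
                  - h_total)
            total += mi / ((n + 1) * math.comb(n, k))
    return total

# Toy joint laws on three binary variables:
# three fully redundant fair bits (X1 = X2 = X3) ...
redundant = np.zeros((2, 2, 2))
redundant[0, 0, 0] = redundant[1, 1, 1] = 0.5
# ... versus three independent fair bits.
independent = np.full((2, 2, 2), 1 / 8)
```

Under these assumptions the independent system scores 0 (every mutual information vanishes), while the fully redundant one scores (n-1)/(n+1), i.e. 0.5 bits for n = 3, since each proper nonempty subset carries exactly one bit of mutual information with its complement.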
@article{AIHPB_2012__48_2_343_0, author = {Buzzi, J. and Zambotti, L.}, title = {Mean mutual information and symmetry breaking for finite random fields}, journal = {Annales de l'I.H.P. Probabilit\'es et statistiques}, volume = {48}, year = {2012}, pages = {343-367}, doi = {10.1214/11-AIHP416}, mrnumber = {2954258}, zbl = {1259.94032}, language = {en}, url = {http://dml.mathdoc.fr/item/AIHPB_2012__48_2_343_0} }
Buzzi, J.; Zambotti, L. Mean mutual information and symmetry breaking for finite random fields. Annales de l'I.H.P. Probabilités et statistiques, Tome 48 (2012) pp. 343-367. doi : 10.1214/11-AIHP416. http://gdmltest.u-ga.fr/item/AIHPB_2012__48_2_343_0/
[1] D. J. Aldous. Exchangeability and related topics. In École d'été de probabilités de Saint-Flour, XIII, 1-198. Lecture Notes in Math. 1117. Springer, Berlin, 1985. | MR 883646 | Zbl 0562.60042
[2] P. Bak and M. Paczuski. Complexity, contingency and criticality. Proc. Natl. Acad. Sci. USA 92 (1995) 6689-6696.
[3] L. Barnett, C. L. Buckley and S. Bullock. Neural complexity and structural connectivity. Phys. Rev. E 79 (2009) 051914. | MR 2551416
[4] C. H. Bennett. How to define complexity in physics and why. In Complexity, Entropy and the Physics of Information, Vol. VIII. W. Zurek (Ed.). Addison-Wesley, Redwood City, 1990.
[5] J. Bertoin. Random Fragmentation and Coagulation Processes. Cambridge Univ. Press, Cambridge, 2006. | Zbl 1107.60002
[6] J. Buzzi and L. Zambotti. Approximate maximizers of intricacy functionals. Probab. Theory Related Fields. To appear. Available at http://arxiv.org/abs/0909.2120. | MR 2948682 | Zbl 1261.94019
[7] T. M. Cover and J. A. Thomas. Elements of Information Theory. John Wiley & Sons, Hoboken, NJ, 2006. | Zbl 1140.94001
[8] J. P. Crutchfield and K. Young. Inferring statistical complexity. Phys. Rev. Lett. 63 (1989) 105-109. | MR 1001514
[9] M. De Lucia, M. Bottaccio, M. Montuori and L. Pietronero. A topological approach to neural complexity. Phys. Rev. E 71 (2005) 016114. | MR 2139320
[10] G. M. Edelman and J. A. Gally. Degeneracy and complexity in biological systems. Proc. Natl. Acad. Sci. USA 98 (2001) 13763-13768.
[11] N. Goldenfeld and L. P. Kadanoff. Simple lessons from complexity. Science 284 (1999) 87-89.
[12] A. Greven, G. Keller and G. Warnecke (Eds). Entropy. Princeton Univ. Press, Princeton, NJ, 2003. | MR 2035814 | Zbl 1187.00001
[13] S. Fujishige. Polymatroidal dependence structure of a set of random variables. Information and Control 39 (1978) 55-72. | MR 514262 | Zbl 0388.94006
[14] T. S. Han. Nonnegative entropy measures of multivariate symmetric correlations. Information and Control 36 (1978) 133-156. | MR 464499 | Zbl 0367.94041
[15] Analytical description of the evolution of neural networks: Learning rules and complexity. Biol. Cybern. 81 (1999) 169-176. | Zbl 0929.92004
[16] J. L. Krichmar, D. A. Nitz, J. A. Gally and G. M. Edelman. Characterizing functional hippocampal pathways in a brain-based device as it solves a spatial memory task. Proc. Natl. Acad. Sci. USA 102 (2005) 2111-2116.
[17] M. Madiman and P. Tetali. Information inequalities for joint distributions, with interpretations and applications. IEEE Trans. Inform. Theory 56 (2010) 2699-2713. | MR 2683430
[18] A. K. Seth, E. Izhikevich, G. N. Reeke and G. M. Edelman. Theories and measures of consciousness: An extended framework. Proc. Natl. Acad. Sci. USA 103 (2006) 10799-10804.
[19] A. K. Seth. Models of consciousness. Scholarpedia 2 (2007) 1328.
[20] M. Shanahan. Dynamical complexity in small-world networks of spiking neurons. Phys. Rev. E 78 (2008) 041924. | MR 2529582
[21] O. Sporns, G. Tononi and G. M. Edelman. Connectivity and complexity: The relationship between neuroanatomy and brain dynamics. Neural Netw. 13 (2000) 909-922.
[22] O. Sporns. Network analysis, complexity, and brain function. Complexity 8 (2002) 56-60. | MR 1969099
[23] O. Sporns. Complexity. Scholarpedia 2 (2007) 1623.
[24] M. Talagrand. Spin Glasses: A Challenge for Mathematicians. Springer, Berlin, 2003. | MR 1993891 | Zbl 1033.82002
[25] T. S. Han. Nonnegative entropy measures of multivariate symmetric correlations. Information and Control 36 (1978) 133-156. | MR 464499 | Zbl 0367.94041
[26] G. Tononi, O. Sporns and G. M. Edelman. A measure for brain complexity: Relating functional segregation and integration in the nervous system. Proc. Natl. Acad. Sci. USA 91 (1994) 5033-5037.
[27] G. Tononi, O. Sporns and G. M. Edelman. A complexity measure for selective matching of signals by the brain. Proc. Natl. Acad. Sci. USA 93 (1996) 3422-3427.
[28] G. Tononi, O. Sporns and G. M. Edelman. Measures of degeneracy and redundancy in biological networks. Proc. Natl. Acad. Sci. USA 96 (1999) 3257-3262.