Scooped by
Rick Quax

How should one quantify the strength of association between two random variables without bias for relationships of a specific form? Despite its conceptual simplicity, this notion of statistical “equitability” has yet to receive a definitive mathematical formalization. Here we argue that equitability is properly formalized by a self-consistency condition closely related to the Data Processing Inequality. Mutual information, a fundamental quantity in information theory, is shown to satisfy this equitability criterion. These findings are at odds with the recent work of Reshef et al. [Reshef DN, et al. (2011) Science 334(6062):1518–1524], which proposed an alternative definition of equitability and introduced a new statistic, the “maximal information coefficient” (MIC), said to satisfy equitability in contradistinction to mutual information. Those conclusions, however, were supported only by limited simulation evidence, not by mathematical arguments. Upon revisiting these claims, we prove that the mathematical definition of equitability proposed by Reshef et al. cannot be satisfied by any (nontrivial) dependence measure. We also identify artifacts in the reported simulation evidence. When these artifacts are removed, estimates of mutual information are found to be more equitable than estimates of MIC. Mutual information is also observed to have consistently higher statistical power than MIC. We conclude that estimating mutual information provides a natural (and often practical) way to equitably quantify statistical associations in large datasets.
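Mutual-information estimates of the kind compared here are typically computed with a plug-in (histogram) estimator. A minimal generic sketch of that idea, not the authors' implementation (the bin count, seed, and variable names are illustrative):

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)  # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)  # marginal of Y, shape (1, bins)
    nz = pxy > 0                         # avoid log(0) on empty cells
    return float((pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=10000)
related = mutual_information(x, x + rng.normal(size=10000))  # strong dependence
indep = mutual_information(x, rng.normal(size=10000))        # independent pair
```

The plug-in estimate is biased slightly upward for independent data, which is one reason careful estimation matters in equitability comparisons; more refined estimators (e.g. k-nearest-neighbour methods) reduce this bias.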

Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a tradeoff between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provide a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.
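Shannon entropy over an empirical response distribution is the basic ingredient of such complexity measures: a response spread evenly over many states is more "complex" than one dominated by a single stereotyped state. A toy sketch with illustrative counts, not the study's MEG pipeline:

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon entropy (bits) of an empirical distribution given event counts."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]  # 0 * log 0 is taken as 0
    return float(-(p * np.log2(p)).sum())

# Evenly spread response states -> high entropy; one dominant state -> low.
h_rich = shannon_entropy([25, 25, 25, 25])  # 2.0 bits
h_poor = shannon_entropy([85, 5, 5, 5])     # ~0.85 bits
```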

There is growing evidence that for a range of dynamical systems featuring complex interactions between large ensembles of interacting elements, mutual information peaks at order-disorder phase transitions. We conjecture that, by contrast, information flow in such systems will generally peak strictly on the disordered side of a phase transition. This conjecture is verified for a ferromagnetic 2D lattice Ising model with Glauber dynamics and a transfer entropy-based measure of system-wide information flow. Implications of the conjecture are considered, in particular, that for a complex dynamical system in the process of transitioning from disordered to ordered dynamics (a mechanism implicated, for example, in financial market crashes and the onset of some types of epileptic seizures), information dynamics may be able to predict an imminent transition. Lionel Barnett, Joseph T. Lizier, Michael Harré, Anil K. Seth, and Terry Bossomaier "Information Flow in a Kinetic Ising Model Peaks in the Disordered Phase" Physical Review Letters 111, 177203 (2013) http://link.aps.org/doi/10.1103/PhysRevLett.111.177203
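Pairwise transfer entropy with history length 1, T(X→Y) = I(Y_{t+1}; X_t | Y_t), can be estimated by plug-in counting over discrete time series. A generic sketch of that estimator, not the paper's lattice-wide measure (the coupled series and seed are illustrative):

```python
import numpy as np
from collections import Counter

def transfer_entropy(src, dst):
    """Plug-in estimate (bits) of TE src -> dst with history length 1:
    T = I(dst_{t+1} ; src_t | dst_t)."""
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))  # (next, dst past, src past)
    n = len(triples)
    p_xyz = Counter(triples)
    p_yz = Counter((y, z) for _, y, z in triples)
    p_xy = Counter((x, y) for x, y, _ in triples)
    p_y = Counter(y for _, y, _ in triples)
    te = 0.0
    for (x, y, z), c in p_xyz.items():
        te += (c / n) * np.log2((c * p_y[y]) / (p_xy[(x, y)] * p_yz[(y, z)]))
    return float(te)

rng = np.random.default_rng(2)
src = rng.integers(0, 2, 20000)
dst = np.empty_like(src)
dst[0] = 0
dst[1:] = src[:-1]                       # dst copies src with a one-step lag
te_causal = transfer_entropy(src, dst)   # high: src drives dst
te_reverse = transfer_entropy(dst, src)  # near zero: no influence back
```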

The notion of closure plays a prominent role in systems theory where it is used to identify or define the system in distinction from its environment and to explain the autonomy of the system. Here, we present a quantitative measure of closure, as opposed to the already existing qualitative notions. We shall elaborate upon the observation that cognitive systems can achieve informational closure by modeling their environment. Formally, then, a system is informationally closed if (almost) no information flows into it from the environment. A system that is independent from its environment trivially achieves informational closure. Simulations of coupled hidden Markov models demonstrate that informational closure can also be realized nontrivially by modeling or controlling the environment. Our analysis of systems that actively influence their environment to achieve closure then reveals interesting connections to the related notion of autonomy. This discussion will then call into question the system–environment distinction that seems so innocent to begin with. It turns out that the notion of autonomy depends crucially on whether, not just the state observables, but also the dynamical processes are attributed to either the system or the environment. In that manner, our conceptualization of informational closure also sheds light on other, more ambitious notions of closure, e.g. organizational closure, semantic closure, closure to efficient cause or operational closure, intended as a fundamental (defining) concept of life itself.
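Informational closure can be read as the conditional mutual information I(E_t ; S_{t+1} | S_t) being (almost) zero: given the system's own state, the environment adds no information about the system's next state. A generic plug-in sketch of that quantity, not the paper's hidden-Markov-model simulations (dynamics and seed are illustrative):

```python
import numpy as np
from collections import Counter

def cond_mutual_info(x, y, z):
    """Plug-in estimate (bits) of I(X;Y|Z) for discrete sequences."""
    n = len(x)
    cxyz = Counter(zip(x, y, z))
    cxz, cyz, cz = Counter(zip(x, z)), Counter(zip(y, z)), Counter(z)
    return float(sum(
        (c / n) * np.log2(c * cz[zz] / (cxz[(xx, zz)] * cyz[(yy, zz)]))
        for (xx, yy, zz), c in cxyz.items()))

rng = np.random.default_rng(3)
T = 20000
env = rng.integers(0, 2, T)
# (a) autonomous system: next state depends only on its own state -> closed
sys_a = np.zeros(T, dtype=int)
for t in range(1, T):
    sys_a[t] = sys_a[t - 1] ^ (rng.random() < 0.1)
# (b) driven system: next state copies the environment -> not closed
sys_b = np.empty(T, dtype=int)
sys_b[0] = 0
sys_b[1:] = env[:-1]

closure_a = cond_mutual_info(env[:-1], sys_a[1:], sys_a[:-1])  # ~ 0 bits
closure_b = cond_mutual_info(env[:-1], sys_b[1:], sys_b[:-1])  # ~ 1 bit
```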

The nature of distributed computation has often been described in terms of the component operations of universal computation: information storage, transfer and modification. We review the first complete framework that quantifies each of these individual information dynamics on a local scale within a system, and describes the manner in which they interact to create nontrivial computation where "the whole is greater than the sum of the parts". We describe the application of the framework to cellular automata, a simple yet powerful model of distributed computation. This is an important application, because the framework is the first to provide quantitative evidence for several important conjectures about distributed computation in cellular automata: that blinkers embody information storage, particles are information transfer agents, and particle collisions are information modification events. The framework is also shown to contrast the computations conducted by several well-known cellular automata, highlighting the importance of information coherence in complex computation. The results reviewed here provide important quantitative insights into the fundamental nature of distributed computation and the dynamics of complex systems, as well as impetus for the framework to be applied to the analysis and design of other systems.

Distributed computation in artificial life and complex systems is often described in terms of component operations on information: information storage, transfer and modification. Information modification remains poorly described however, with the popularly understood examples of glider and particle collisions in cellular automata being only quantitatively identified to date using a heuristic (separable information) rather than a proper information-theoretic measure. We outline how a recently introduced axiomatic framework for measuring information redundancy and synergy, called partial information decomposition, can be applied to a perspective of distributed computation in order to quantify component operations on information. Using this framework, we propose a new measure of information modification that captures the intuitive understanding of information modification events as those involving interactions between two or more information sources. We also consider how the local dynamics of information modification in space and time could be measured, and suggest a new axiom that redundancy measures would need to meet in order to make such local measurements. Finally, we evaluate the potential for existing redundancy measures to meet this localizability axiom.

It is notoriously difficult to predict the behaviour of a complex self-organizing system, where the interactions among dynamical units form a heterogeneous topology. Even if the dynamics of each microscopic unit is known, a real understanding of their contributions to the macroscopic system behaviour is still lacking. Here, we develop information-theoretical methods to distinguish the contribution of each individual unit to the collective out-of-equilibrium dynamics. We show that for a system of units connected by a network of interaction potentials with an arbitrary degree distribution, highly connected units have less impact on the system dynamics when compared with intermediately connected units. In an equilibrium setting, the hubs are often found to dictate the long-term behaviour. However, we find both analytically and experimentally that the instantaneous states of these units have a short-lasting effect on the state trajectory of the entire system. We present qualitative evidence of this phenomenon from empirical findings about a social network of product recommendations, a protein–protein interaction network and a neural network, suggesting that it might indeed be a widespread property in nature. "The diminishing role of hubs in dynamical processes on complex networks" Quax R, Apolloni A and Sloot P.M.A. Journal of the Royal Society Interface, 10, 20130568, published 4 September 2013 http://dx.doi.org/10.1098/rsif.2013.0568

In financial markets, participants locally optimize their profit, which can result in a globally unstable state leading to a catastrophic change.

We use a standard discrete-time linear Gaussian model to analyze the information storage capability of individual nodes in complex networks, given the network structure and link weights. In particular, we investigate the role of two- and three-node motifs in contributing to local information storage. We show analytically that directed feedback and feedforward loop motifs are the dominant contributors to information storage capability, with their weighted motif counts locally positively correlated to storage capability. We also reveal the direct local relationship between clustering coefficient(s) and information storage. These results explain the dynamical importance of clustered structure and offer an explanation for the prevalence of these motifs in biological and artificial networks. Information storage, loop motifs, and clustered structure in complex networks Joseph T. Lizier, Fatihcan M. Atay and Jürgen Jost Phys. Rev. E 86, 026110 (2012) http://dx.doi.org/10.1103/PhysRevE.86.026110
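For the univariate special case of such a linear Gaussian model, x_{t+1} = a·x_t + noise, active information storage has the closed form −½·log2(1 − a²): storage grows with the self-coupling strength a. A sketch of this univariate simplification (not the paper's network-level calculation):

```python
import numpy as np

def gaussian_ais(a):
    """Active information storage (bits) of the stationary AR(1) process
    x_{t+1} = a * x_t + noise: I(x_t; x_{t+1}) = -1/2 * log2(1 - a^2)."""
    return -0.5 * np.log2(1 - a ** 2)

# Stronger self-coupling (the analogue of heavier feedback-loop weights)
# yields greater storage capability.
weak, strong = gaussian_ais(0.3), gaussian_ais(0.9)
```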

The nature of distributed computation in complex systems has often been described in terms of memory, communication and processing. This thesis presents a complete information-theoretic framework to quantify these operations on information (i.e. information storage, transfer and modification), and in particular their dynamics in space and time. The framework is applied to cellular automata, and delivers important insights into the fundamental nature of distributed computation and the dynamics of complex systems (e.g. that gliders are dominant information transfer agents). Applications to several important network models, including random Boolean networks, suggest that the capability for information storage and coherent transfer are maximised near the critical regime in certain order-chaos phase transitions. Further applications to study and design information structure in the contexts of computational neuroscience and guided self-organisation underline the practical utility of the techniques presented here. "The Local Information Dynamics of Distributed Computation in Complex Systems" Joseph T. Lizier (With foreword by Dr. Mikhail Prokopenko) Springer Theses, Springer: Berlin/Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-32952-4

Thermodynamic entropy was initially proposed by Clausius in 1865. Since then it has been implemented in the analysis of different systems, and is seen as a promising concept to understand the evolution of open systems in non-equilibrium conditions. Information entropy was proposed by Shannon in 1948, and has become an important concept to measure information in different systems. Both thermodynamic entropy and information entropy have been extensively applied in different fields related to the Critical Zone, such as hydrology, ecology, pedology, and geomorphology. In this study, we review the most important applications of these concepts in those fields, including how they are calculated, and how they have been utilized to analyze different processes. We then synthesize the link between thermodynamic and information entropies in the light of energy dissipation and organizational patterns, and discuss how this link may be used to enhance the understanding of the Critical Zone.
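The link between the two entropies is quantitative: over the same probability distribution, the Gibbs entropy S = −k_B Σ p ln p equals k_B·ln 2 times the Shannon entropy in bits. A minimal sketch (the example distribution is illustrative):

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_bits(p):
    """Shannon information entropy in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def gibbs_entropy(p):
    """Thermodynamic (Gibbs) entropy in J/K over the same distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-K_B * (p * np.log(p)).sum())

p = [0.5, 0.25, 0.125, 0.125]
h = shannon_bits(p)    # 1.75 bits
s = gibbs_entropy(p)   # = K_B * ln(2) * h
```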

Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today’s digital computers, the application of these concepts to neural information processing was hampered by the lack of proper mathematical definitions of these operations on information. Recently, such definitions were given and the specific concept of local active information storage (LAIS) was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure local active information storage in neural data has been made to date. Here we measure local active information storage on a local scale in time and space in voltage-sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding.

Guided Self-Organization: Inception (Emergence, Complexity and Computation), Mikhail Prokopenko. Is it possible to guide the process of self-organisation towards specific patterns and outcomes?

We present a tentative proposal for a quantitative measure of autonomy. This is something that, surprisingly, is rarely found in the literature, even though autonomy is considered to be a basic concept in many disciplines, including artificial life. We work in an information-theoretic setting for which the distinction between system and environment is the starting point. As a first measure for autonomy, we propose the conditional mutual information between consecutive states of the system conditioned on the history of the environment. This works well when the system cannot influence the environment at all and the environment does not interact synergetically with the system. When, in contrast, the system has full control over its environment, we should instead neglect the environment history and simply take the mutual information between consecutive system states as a measure of autonomy. In the case of mutual interaction between system and...
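In the full-control case, the proposal reduces to the mutual information between consecutive system states, which for a Markov chain follows directly from its transition matrix. A generic sketch, not the authors' formalism (the two-state chains are chosen purely for illustration):

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a Markov chain with transition matrix P."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])  # eigenvector for eigenvalue 1
    return pi / pi.sum()

def mi_consecutive(P):
    """Autonomy as I(S_t; S_{t+1}) for a stationary Markov chain, in bits."""
    pi = stationary(P)
    joint = pi[:, None] * P               # p(s_t, s_{t+1})
    px, py = joint.sum(1), joint.sum(0)   # marginals
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz])).sum())

# Near-deterministic self-dynamics -> high autonomy; pure randomness -> none.
sticky = np.array([[0.95, 0.05], [0.05, 0.95]])
random_walk = np.array([[0.5, 0.5], [0.5, 0.5]])
a_high, a_low = mi_consecutive(sticky), mi_consecutive(random_walk)
```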

How can the information that a set of random variables contains about another random variable be decomposed? To what extent do different subgroups provide the same, i.e. shared or redundant, information, carry unique information or interact for the emergence of synergistic information? Recently Williams and Beer proposed such a decomposition based on natural properties for shared information. While these properties fix the structure of the decomposition, they do not uniquely specify the values of the different terms. Therefore, we investigate additional properties such as strong symmetry and left monotonicity. We find that strong symmetry is incompatible with the properties proposed by Williams and Beer. Although left monotonicity is a very natural property for an information measure it is not fulfilled by any of the proposed measures. We also study a geometric framework for information decompositions and ask whether it is possible to represent shared information by a family of posterior distributions. Finally, we draw connections to the notions of shared knowledge and common knowledge in game theory. While many people believe that independent variables cannot share information, we show that in game theory independent agents can have shared knowledge, but not common knowledge. We conclude that intuition and heuristic arguments do not suffice when arguing about information.
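The Williams-Beer redundancy measure I_min averages, over target outcomes, the minimum specific information any single source provides about that outcome. A sketch for two canonical two-source cases (a generic illustration, not the paper's geometric framework): XOR inputs yield zero redundancy (the joint bit is purely synergistic), while duplicated inputs yield one fully redundant bit.

```python
import numpy as np
from itertools import product

def specific_info(pjoint, y):
    """I(Y=y; X): specific information a source X provides about outcome y.
    pjoint: dict {(x, y): prob}."""
    py = sum(p for (_, yy), p in pjoint.items() if yy == y)
    total = 0.0
    for x in {x for (x, _) in pjoint}:
        px = sum(p for (xx, _), p in pjoint.items() if xx == x)
        pxy = pjoint.get((x, y), 0.0)
        if pxy > 0:
            total += (pxy / py) * np.log2(pxy / (px * py))
    return total

def i_min(p_x1x2y):
    """Williams-Beer redundancy I_min(Y; {X1}, {X2}) in bits."""
    j1, j2, py = {}, {}, {}
    for (x1, x2, y), p in p_x1x2y.items():
        j1[(x1, y)] = j1.get((x1, y), 0.0) + p  # marginal joint of X1 with Y
        j2[(x2, y)] = j2.get((x2, y), 0.0) + p  # marginal joint of X2 with Y
        py[y] = py.get(y, 0.0) + p
    return sum(py[y] * min(specific_info(j1, y), specific_info(j2, y))
               for y in py)

# XOR: each input alone tells nothing about the output -> zero redundancy.
xor = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}
# Duplicate: both inputs are copies of the target -> one redundant bit.
dup = {(y, y, y): 0.5 for y in (0, 1)}
r_xor, r_dup = i_min(xor), i_min(dup)
```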

Communication, one of the most important functions of life, occurs at any spatial scale from the molecular one up to that of populations and ecosystems, and any time scale from that of fast chemical reactions up to that of geological ages. Information theory, a mathematical science of communication initiated by Shannon in 1948, has been very successful in engineering, but biologists have largely ignored it. This book aims at bridging this gap. It proposes an abstract definition of information based on the engineers' experience which makes it usable in life sciences. It expounds information theory and error-correcting codes, its by-products, as simply as possible. Then, the fundamental biological problem of heredity is examined. It is shown that biology does not adequately account for the conservation of genomes during geological ages, which can be understood only if it is assumed that genomes are made resilient to casual errors by proper coding. Moreover, the good conservation of very old parts of genomes, like the HOX genes, implies that the assumed genomic codes have a nested structure which makes information the more resilient to errors the older it is. The consequences that information theory draws from these hypotheses match very basic but as yet unexplained biological facts, e.g., the existence of successive generations, that of discrete species and the trend of evolution towards complexity. Being necessarily inscribed on physical media, information appears as a bridge between the abstract and the concrete. Recording, communicating and using information exclusively occur in the living world. Information is thus coextensive with life and delineates the border between the living and the inanimate.
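The simplest error-correcting code illustrates the resilience argument: replicating each symbol lets a majority vote undo isolated errors. A toy sketch of a rate-1/3 repetition code (far simpler than the nested genomic codes the book hypothesizes):

```python
def encode(bits):
    """Rate-1/3 repetition code: transmit each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority vote over each triple corrects any single flip per triple."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

msg = [1, 0, 1, 1]
tx = encode(msg)
tx[1] ^= 1   # a "casual error" hits one copy...
tx[9] ^= 1   # ...and another, in a different triple
decoded = decode(tx)  # both errors corrected
```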

John A. Wheeler's "Information, physics, quantum: the search for links"
