The storage, transfer, loss, and processing of information (bits) in complex systems. This is a bulletin board for articles on the topic, without any implied opinion on its correctness or expected impact.
Quantifying synergy among stochastic variables is an important open problem in information theory. Information synergy occurs when multiple sources together predict an outcome variable better than the sum of single-source predictions. It is an essential phenomenon in biology, for instance in neuronal networks and cellular regulatory processes, where different information flows integrate to produce a single response, but it also appears in social cooperation and in statistical inference tasks in machine learning. Here we propose a metric of synergistic entropy and synergistic information from first principles. The proposed measure relies on so-called synergistic random variables (SRVs), which are constructed to have zero mutual information about individual source variables but non-zero mutual information about the complete set of source variables. We prove several basic and desired properties of our measure, including bounds and additivity properties. In addition, we prove several important consequences of our measure, including the fact that different types of synergistic information may co-exist between the same sets of variables. A numerical implementation is provided, which we use to demonstrate that synergy is associated with resilience to noise. Our measure may be a marked step forward in the study of multivariate information theory and its numerous applications.
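The defining SRV property can be checked on the canonical example: the XOR of two fair coin flips carries zero mutual information about either input alone, yet one full bit about the pair. The sketch below verifies this with a plain Shannon mutual-information calculation over the uniform joint distribution; it is only an illustration of the property, not the authors' SRV construction.

```python
# Verify that Y = X1 XOR X2 is "fully synergistic" in the SRV sense:
# I(X1;Y) = I(X2;Y) = 0, but I(X1,X2;Y) = 1 bit.
from itertools import product
from math import log2
from collections import defaultdict

def mutual_information(pairs):
    """I(A;B) in bits from a list of equiprobable (a, b) outcomes."""
    n = len(pairs)
    p_ab, p_a, p_b = defaultdict(float), defaultdict(float), defaultdict(float)
    for a, b in pairs:
        p_ab[(a, b)] += 1 / n
        p_a[a] += 1 / n
        p_b[b] += 1 / n
    return sum(p * log2(p / (p_a[a] * p_b[b])) for (a, b), p in p_ab.items())

# Two fair coin flips and their XOR.
outcomes = [(x1, x2, x1 ^ x2) for x1, x2 in product((0, 1), repeat=2)]

print(mutual_information([(x1, y) for x1, _, y in outcomes]))         # 0.0
print(mutual_information([(x2, y) for _, x2, y in outcomes]))         # 0.0
print(mutual_information([((x1, x2), y) for x1, x2, y in outcomes]))  # 1.0
```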
Rick Quax's insight:
In our publication in Entropy we open up a completely new way of thinking about synergy. The currently dominant school of thought dates back to 2008, with the idea that synergistic information and individual information must sum up to the total amount of information. Although intuitive, to date there unfortunately seems to be no satisfactory axiom set and accompanying formula that has earned consensus. Our proposal starts completely from scratch, building on two very basic ideas: (1) that an output variable is fully synergistic when it stores zero information about any individual input; and (2) that a partially synergistic variable will correlate with a fully synergistic variable. Then we simply follow where the mathematics takes us. In the end we find a promising and well-defined formula for quantifying synergy. We can use it to prove various interesting things, such as the maximum amount of synergy that a fully synergistic variable can store about other variables. We feel it is a very promising path. Whether this new path leads to (part of) the solution remains to be seen.
Accurately determining dependency structure is critical to discovering a system's causal organization. We recently showed that the transfer entropy fails in a key aspect of this---measuring information flow---due to its conflation of dyadic and polyadic relationships. We extend this observation to demonstrate that this is true of all such Shannon information measures when used to analyze multivariate dependencies. This has broad implications, particularly when employing information to express the organization and mechanisms embedded in complex systems, including the burgeoning efforts to combine complex network theory with information theory. Here, we do not suggest that any aspect of information theory is wrong. Rather, the vast majority of its informational measures are simply inadequate for determining the meaningful dependency structure within joint probability distributions. Therefore, such information measures are inadequate for discovering intrinsic causal relations. We close by demonstrating that such distributions exist across an arbitrary set of variables.
Theme issue ‘Quantum foundations: information approach’ compiled and edited by Giacomo Mauro D'Ariano and Andrei Khrennikov.
"This special issue is based on the contributions of a group of top experts in quantum foundations and quantum information and probability. It enlightens a number of interpretational, mathematical and experimental problems of quantum theory."
The yearly Information Processing in Complex Systems (IPCS) satellite meeting is organized during the Conference on Complex Systems. Our objective is to provide a forum for researchers who follow an information-theoretic approach to complex systems. Here they can present recent achievements and discuss promising hypotheses and further research directions.
The IPCS’16 edition will combine classical and quantum information approaches. If you would like to attend this scientific discourse, you only need to register for CCS’16. If you would like to present your research in a 20-minute presentation, please submit a short abstract via our submission page before July 10.
Our satellite meeting will be a full-day session on September 22 in the Beurs van Berlage in Amsterdam. Room will be announced later.
Abstract: We quantify characteristics of the informational architecture of two representative biological networks: the Boolean network model for the cell-cycle regulatory network of the fission yeast Schizosaccharomyces [...]
Measures of information transfer have become a popular approach to analyze interactions in complex systems such as the Earth or the human brain from measured time series. Recent work has focused on causal definitions of information transfer aimed at decompositions of predictive information about a target variable, while excluding effects of common drivers and indirect influences. While common drivers clearly constitute a spurious causality, the aim of the present article is to develop measures quantifying different notions of the strength of information transfer along indirect causal paths, based on first reconstructing the multivariate causal network. Another class of novel measures quantifies to what extent different intermediate processes on causal paths contribute to an interaction mechanism to determine pathways of causal information transfer. The proposed framework complements predictive decomposition schemes by focusing more on the interaction mechanism between multiple processes. A rigorous mathematical framework allows for a clear information-theoretic interpretation that can also be related to the underlying dynamics as proven for certain classes of processes. Generally, however, estimates of information transfer remain hard to interpret for nonlinearly intertwined complex systems. But if experiments or mathematical models are not available, then measuring pathways of information transfer within the causal dependency structure allows at least for an abstraction of the dynamics. The measures are illustrated on a climatological example to disentangle pathways of atmospheric flow over Europe.
The topic is the inherent storage, transfer, and processing of information in any dynamical system ‘simply doing its thing’ in a quantitative, information-theoretical sense (such as, but not limited to, Shannon’s). That is, in a system of interacting units (e.g., spins, or neurons), the idea is that multiple flows of information reach a unit, where they are processed to new information, which then flows onward, etc. However, it is still an open question how to define and quantify such ‘information processing’, or even whether different frameworks are needed for different questions. Our overarching hypothesis is that studying this underlying process may lead to a better understanding of the emergent behavior of complex systems. The idea of the meeting is that each presentation is a piece of the puzzle and that from combining them hopefully a complete picture will emerge someday.
Understanding how information about external stimuli is transformed into behavior is one of the central goals of neuroscience. Here we characterize the information flow through a complete sensorimotor circuit: from stimulus, to sensory neurons, to interneurons, to motor neurons, to muscles, to motion. Specifically, we apply a recently developed framework for quantifying information flow to a previously published ensemble of models of salt klinotaxis in the nematode worm C. elegans. The models are grounded in the neuroanatomy and currently known neurophysiology of the worm. The unknown model parameters were optimized to reproduce the worm's behavior. Information flow analysis reveals several key principles underlying how the models operate: (1) Interneuron class AIY is responsible for integrating information about positive and negative changes in concentration, and exhibits a strong left/right information asymmetry. (2) Gap junctions play a crucial role in the transfer of information responsible for the information symmetry observed in interneuron class AIZ. (3) Neck motor neuron class SMB implements an information gating mechanism that underlies the circuit's state-dependent response. (4) The neck carries a non-uniform distribution of information about changes in concentration; thus, not all directions of movement are equally informative. Each of these findings corresponds to an experimental prediction that could be tested in the worm to greatly refine our understanding of the neural circuit underlying klinotaxis. Information flow analysis also allows us to explore how information flow relates to underlying electrophysiology. Despite large variations in the neural parameters of individual circuits, the overall information flow architecture of the circuit is remarkably consistent across the ensemble, suggesting that information flow analysis captures general principles of operation for the klinotaxis circuit.
"There is a widespread assumption that the universe in general, and life in particular, is 'getting more complex with time'. This book brings together a wide range of experts in science, philosophy and theology and unveils their joint effort in exploring this idea. They confront essential problems behind the theory of complexity and the role of life within it: what is complexity? When does it increase, and why? Is the universe evolving towards states of ever greater complexity and diversity? If so, what is the source of this universal enrichment? This book addresses those difficult questions, and offers a unique cross-disciplinary perspective on some of the most profound issues at the heart of science and philosophy. Readers will gain insights in complexity that reach deep into key areas of physics, biology, complexity science, philosophy and religion."
Rick Quax's insight:
The book also contains quite a few clues about information-theoretical perspectives, especially in Davies' and Lineweaver's contributions.
All systems in nature have one thing in common: they process information. Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred and modified. Indeed, bits of information about the state of one element will travel—imperfectly—to the state of the other element, forming its new state. This storage, transfer, and modification of information, possibly between levels of a multi-level system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions. Mapping out exactly how these bits of information percolate through the system could reveal new fundamental insights into how the parts orchestrate to produce the properties of the system. A theory of information processing would be capable of defining a set of universal properties of dynamical multi-level complex systems, which describe and compare the dynamics of diverse complex systems ranging from social interaction to brain networks, from financial markets to biomedicine. Each possible combination of rules of dynamics and topology of interactions, with disparate semantics, would reduce to a single language of information processing.
Please correct me if I interpret it wrong. If agent A optimizes its description of another agent B then the Kullback-Leibler divergence (between the state of B and A's representation of B) decreases. This optimization is achieved by tuning the parameters which drive the state of A. Then the Fisher information of A's representation (parameterized by B's state) naturally increases, and an increasing Fisher information is assumed to imply criticality (along the lines of 'sensitivity to parameters').
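The two quantities in this argument can be made concrete with a toy Bernoulli example: the KL divergence between B's state distribution and A's representation falls as A's parameter is tuned toward B's, while the Fisher information of the Bernoulli family, 1/(q(1-q)), grows as the representation becomes sharply tuned. This is only a numeric sketch of the definitions under an assumed one-parameter model, not the paper's derivation; the value p_true = 0.9 is a hypothetical choice.

```python
# Toy sketch (assumed Bernoulli model, hypothetical parameters):
# KL(B || A) decreases as A tunes q toward B's p, while the Fisher
# information of the Bernoulli family increases near sharp tuning.
from math import log2

def kl_bernoulli(p, q):
    """KL divergence D(Bernoulli(p) || Bernoulli(q)) in bits."""
    return p * log2(p / q) + (1 - p) * log2((1 - p) / (1 - q))

def fisher_bernoulli(q):
    """Fisher information of the Bernoulli family at parameter q."""
    return 1.0 / (q * (1.0 - q))

p_true = 0.9  # hypothetical state distribution of agent B
for q in (0.5, 0.7, 0.9):  # agent A tunes its representation toward B
    print(q, kl_bernoulli(p_true, q), fisher_bernoulli(q))
```

As q approaches p_true the divergence falls to zero while the Fisher information rises, matching the direction of the argument above.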
This satellite meeting centers especially around the viewpoint of considering ‘information’ to be a quantity that is inherently stored, transferred, and modified (e.g., integrated/synergy) in (complex) dynamical systems, or perhaps even more fundamentally, to start thinking of any dynamical process as a (Turing) computational process. This information-centered approach hopefully brings better theory and understanding for the case of complex systems, for which traditional tools have limited success. We are inviting like-minded researchers to meet at the satellite meeting “Information Processing in Complex Systems” or IPCS’14 during the upcoming ECCS’14 conference, now the 3rd edition.
We aim to grow this meeting into a central forum on the subject, on which researchers currently still form dispersed and fairly isolated groups with diverse ideas.
Confirmed invited speakers: Prof. dr. Kristian Lindgren, Dr. Paul Williams, and Hermann Haken.
Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing.
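The core idea of a delay-based reservoir can be sketched in a few lines: a single nonlinear node is time-multiplexed over N "virtual nodes" by a fixed random input mask, and its delayed feedback provides the high-dimensional state that a linear readout would then use. The sketch below is a simplified software analogue under assumed parameters (the tanh nonlinearity, alpha, beta, and N are hypothetical choices), not the paper's electronic implementation.

```python
# Minimal sketch of a delay-based reservoir: one nonlinear node with
# delayed feedback, time-multiplexed over N virtual nodes by a random
# input mask. Simplified parallel update; parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
N = 50                                   # virtual nodes per delay interval
mask = rng.choice((-1.0, 1.0), size=N)   # fixed random input mask
alpha, beta = 0.8, 0.5                   # feedback and input scaling

def reservoir_states(inputs):
    """Map a scalar time series to a sequence of N-dimensional states."""
    x = np.zeros(N)          # node values along one delay line
    states = []
    for u in inputs:
        # each virtual node mixes its delayed value with the masked input
        x = np.tanh(alpha * x + beta * mask * u)
        states.append(x.copy())
    return np.array(states)

u = np.sin(np.linspace(0, 8 * np.pi, 200))   # toy input signal
X = reservoir_states(u)
print(X.shape)   # (200, 50): a rich state for a linear readout to use
```

In an actual reservoir-computing pipeline, a benchmark task would be solved by fitting only a linear readout on these states, leaving the nonlinear node untrained.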
The interactions between three or more random variables are often nontrivial, poorly understood and, yet, are paramount for future advances in fields such as network information theory, neuroscience and genetics. In this work, we analyze these interactions as different modes of information sharing. Towards this end, and in contrast to most of the literature that focuses on analyzing the mutual information, we introduce an axiomatic framework for decomposing the joint entropy that characterizes the various ways in which random variables can share information. Our framework distinguishes between interdependencies where the information is shared redundantly and synergistic interdependencies where the sharing structure exists in the whole, but not between the parts. The key contribution of our approach is to focus on symmetric properties of this sharing, which do not depend on a specific point of view for differentiating roles between its components. We show that our axioms determine unique formulas for all of the terms of the proposed decomposition for systems of three variables in several cases of interest. Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.
Quantifying cooperation or synergy among random variables in predicting a single target random variable is an important problem in many complex systems. We review three prior information-theoretic measures of synergy and introduce a novel synergy measure defined as the difference between the whole and the union of its parts. We apply all four measures against a suite of binary circuits to demonstrate that our measure alone quantifies the intuitive concept of synergy across all examples. We show that for our measure of synergy that independent predictors can have positive redundant information.
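The contrast between the whole and the sum of its parts can be seen on two of the simplest binary circuits: for XOR the individual inputs predict nothing yet the pair predicts everything, while for AND each input already carries partial information. The sketch below does only the naive whole-versus-sum comparison; the paper's measure subtracts the *union* of the parts' information, which is not computed here.

```python
# Compare joint predictive information with single-source informations
# for two binary circuits (naive comparison, not the paper's measure).
from math import log2
from collections import Counter

def mi(pairs):
    """I(A;B) in bits from equiprobable (a, b) samples."""
    n = len(pairs)
    pab = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    return sum((c / n) * log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in pab.items())

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]   # two fair input bits
for name, gate in (("XOR", lambda a, b: a ^ b), ("AND", lambda a, b: a & b)):
    y = [gate(a, b) for a, b in inputs]
    i1 = mi(list(zip([a for a, _ in inputs], y)))
    i2 = mi(list(zip([b for _, b in inputs], y)))
    i12 = mi(list(zip(inputs, y)))
    print(f"{name}: I(X1;Y)={i1:.3f}  I(X2;Y)={i2:.3f}  I(X1,X2;Y)={i12:.3f}")
```

For XOR the individual terms are 0 and the joint term is 1 bit (pure synergy); for AND each input carries about 0.311 bits while the pair carries about 0.811 bits, so the whole still exceeds the sum of the parts.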
There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science.
Transfer entropy is a recently introduced information-theoretic measure quantifying directed statistical coherence between spatiotemporal processes, and is widely used in diverse fields ranging from finance to neuroscience. However, its relationships to fundamental limits of computation, such as Landauer's limit, remain unknown. Here we show that in order to increase transfer entropy (predictability) by one bit, heat flow must match or exceed Landauer's limit. Importantly, we generalise Landauer's limit to bi-directional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure). Furthermore, the results are related to negentropy, and to Bremermann's limit and the Bekenstein bound, producing, perhaps surprisingly, lower bounds on the computational deceleration and information loss incurred during an increase in predictability about the process. The identified relationships set new computational limits in terms of fundamental physical quantities, and establish transfer entropy as a central measure connecting information theory, thermodynamics and theory of computation.
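To make the central quantity concrete: for history length 1, transfer entropy is TE(X→Y) = I(Y_{t+1}; X_t | Y_t). The sketch below is a plain plug-in estimate on synthetic binary series (the coupled process and sample size are arbitrary choices for illustration); the paper's thermodynamic bound relating added predictability to Landauer's limit is of course not something this estimator can demonstrate.

```python
# Plug-in sketch of transfer entropy TE(X -> Y) = I(Y_{t+1}; X_t | Y_t)
# for binary time series, history length 1. Synthetic data: y copies x
# with a one-step delay, so transfer x -> y is strong and y -> x is not.
from math import log2
from collections import Counter
import random

def transfer_entropy(x, y):
    """TE(X -> Y) in bits, history length 1, plug-in estimate."""
    triples = list(zip(y[1:], x[:-1], y[:-1]))       # (y_next, x_now, y_now)
    n = len(triples)
    c3 = Counter(triples)
    c_xy = Counter((xn, yo) for _, xn, yo in triples)  # (x_now, y_now)
    c_yy = Counter((yn, yo) for yn, _, yo in triples)  # (y_next, y_now)
    c_y = Counter(yo for _, _, yo in triples)
    return sum((c / n) * log2((c * c_y[yo]) / (c_xy[(xn, yo)] * c_yy[(yn, yo)]))
               for (yn, xn, yo), c in c3.items())

random.seed(1)
x = [random.randint(0, 1) for _ in range(10000)]
y = [0] + x[:-1]                  # y is x delayed by one step
print(transfer_entropy(x, y))     # close to 1 bit
print(transfer_entropy(y, x))     # close to 0 bits
```

The asymmetry of the two estimates is what makes transfer entropy a directed measure, in contrast to mutual information.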
Information theory is being increasingly used to analyze complex, self-organizing processes on networks, predominantly in analytical and numerical studies. Perhaps one of the most paradigmatic complex systems is a network of neurons, in which cognition arises from the information storage, transfer, and processing among individual neurons. In this article we review experimental techniques suitable for validating information-theoretical predictions in simple neural networks, as well as generating new hypotheses. Specifically, we focus on techniques that may be used to measure both network (microcircuit) anatomy as well as neuronal activity simultaneously. This is needed to study the role of the network structure on the emergent collective dynamics, which is one of the reasons to study the characteristics of information processing. We discuss in detail two suitable techniques, namely calcium imaging and the application of multi-electrode arrays to simple neural networks in culture, and discuss their advantages and limitations in an accessible manner for non-experts. In particular, we show that each technique induces a qualitatively different type of error on the measured mutual information. The ultimate goal of this work is to bridge the gap between theorists and experimentalists in their shared goal of understanding the behavior of networks of neurons.
JIDT provides a stand-alone, open-source Java implementation (usable in Matlab, Octave, Python and R) of information-theoretic measures of distributed computation in complex systems: i.e., information storage, transfer and modification.
JIDT includes implementations:
- principally for the measures transfer entropy, mutual information, and their conditional variants, as well as active information storage, entropy, etc.;
- for both discrete and continuous-valued data;
- using various types of estimators (e.g., Kraskov-Stögbauer-Grassberger estimators, box-kernel estimation, linear-Gaussian);
as described in full at ImplementedMeasures.
JIDT is distributed under the GNU GPL v3 license (or later).
The fate of Schrödinger's cat depends on the particular path of a single electron. If the electron hits the trigger, which opens a bottle of poisonous gas, then the cat dies. If the path misses the trigger, the cat lives. According to quantum mechanics, electrons do not normally follow a definite trajectory. Instead, an electron can trace both paths at the same time. As a result, Schrödinger's cat is both dead and alive. Quantum mechanics also teaches us that a “which-path” detector—that is, any measurement device that can show where the electron travels—forces the electron to choose just one path, thus sealing the cat's destiny. But what will happen after the information collected by the detector has been erased? The results of a semiconductor version of such an experiment, reported by Weisz et al. on page 1363 of this issue (1), suggest that the cat will come back to the middle ground between life and death.
Rick Quax's insight:
Erasing information restores a quantum state? Can information indeed be a fundamental concept in physics?