Information Processing in Complex Systems
The storage, transfer, loss, and processing of information (bits) in complex systems. This is a bulletin board for articles on the topic, without any implied opinion on their correctness or expected impact.
Curated by Rick Quax

Understanding Interdependency Through Complex Information Sharing

The interactions between three or more random variables are often nontrivial, poorly understood, and yet paramount for future advances in fields such as network information theory, neuroscience and genetics. In this work, we analyze these interactions as different modes of information sharing. Towards this end, and in contrast to most of the literature that focuses on analyzing the mutual information, we introduce an axiomatic framework for decomposing the joint entropy that characterizes the various ways in which random variables can share information. Our framework distinguishes between interdependencies where the information is shared redundantly and synergistic interdependencies where the sharing structure exists in the whole, but not between the parts. The key contribution of our approach is to focus on symmetric properties of this sharing, which do not depend on a specific point of view for differentiating roles between its components. We show that our axioms determine unique formulas for all of the terms of the proposed decomposition for systems of three variables in several cases of interest. Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.
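
As a minimal illustration of such sharing modes (our example, not taken from the paper): for the XOR gate Z = X xor Y with independent, uniformly random inputs, each input alone tells us nothing about the output, while the pair determines it completely. In standard notation,

    I(X;Z) = I(Y;Z) = 0, \qquad I(X,Y;Z) = 1 \text{ bit}, \qquad I(X;Y;Z) = I(X;Y) - I(X;Y \mid Z) = 0 - 1 = -1 \text{ bit},

and the negative co-information is the classic signature of purely synergistic sharing: structure that exists in the whole, but not between the parts.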

Thermodynamics of Error Correction

Copying information is fundamental in both nature and human industries. Researchers show how the accuracy of a simple copying process is related to thermodynamics.

Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences: 374 (2063)

Theme issue ‘DNA as information’ compiled and edited by Julyan H. E. Cartwright, Simone Giannerini and Diego L. González
Rick Quax's insight:

A compilation of very interesting viewpoints by interesting people, not necessarily specific to DNA.


Quantifying synergistic mutual information

Quantifying cooperation or synergy among random variables in predicting a single target random variable is an important problem in many complex systems. We review three prior information-theoretic measures of synergy and introduce a novel synergy measure defined as the difference between the whole and the union of its parts. We apply all four measures against a suite of binary circuits to demonstrate that our measure alone quantifies the intuitive concept of synergy across all examples. We show that, under our measure of synergy, independent predictors can have positive redundant information.
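
The paper's "whole minus union" measure requires an optimization over distributions, but the cruder whole-minus-sum quantity from the same literature, S = I(X1,X2;Y) - I(X1;Y) - I(X2;Y), can be computed exactly from a joint distribution. A runnable sketch (our code, not the authors'; it illustrates the kind of binary-circuit comparison the paper performs):

    import itertools
    from math import log2

    def entropy(dist):
        """Shannon entropy (bits) of a dict mapping outcome -> probability."""
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    def marginal(joint, idxs):
        """Marginalize a joint dict {(x1, x2, y): p} onto the given indices."""
        out = {}
        for outcome, p in joint.items():
            key = tuple(outcome[i] for i in idxs)
            out[key] = out.get(key, 0.0) + p
        return out

    def mi(joint, a, b):
        """I(A;B) = H(A) + H(B) - H(A,B)."""
        return (entropy(marginal(joint, a)) + entropy(marginal(joint, b))
                - entropy(marginal(joint, a + b)))

    def whole_minus_sum(joint):
        """I(X1,X2;Y) - I(X1;Y) - I(X2;Y) for a joint over (x1, x2, y)."""
        return mi(joint, (0, 1), (2,)) - mi(joint, (0,), (2,)) - mi(joint, (1,), (2,))

    # Binary circuits with independent uniform inputs.
    xor = {(a, b, a ^ b): 0.25 for a, b in itertools.product((0, 1), repeat=2)}
    and_ = {(a, b, a & b): 0.25 for a, b in itertools.product((0, 1), repeat=2)}

    print(whole_minus_sum(xor))   # 1.0 bit: purely synergistic
    print(whole_minus_sum(and_))  # ~0.189 bits

Whole-minus-sum conflates synergy with redundancy (it goes negative when redundancy dominates, e.g. for duplicated predictors), which is one motivation for the union-based definition proposed here.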


Information Processing and Dynamics in Minimally Cognitive Agents


There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science.


Transfer Entropy and Transient Limits of Computation (Scientific Reports, Nature Publishing Group)

Transfer entropy is a recently introduced information-theoretic measure quantifying directed statistical coherence between spatiotemporal processes, and is widely used in diverse fields ranging from finance to neuroscience. However, its relationships to fundamental limits of computation, such as Landauer's limit, remain unknown. Here we show that in order to increase transfer entropy (predictability) by one bit, heat flow must match or exceed Landauer's limit. Importantly, we generalise Landauer's limit to bi-directional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure). Furthermore, the results are related to negentropy, and to Bremermann's limit and the Bekenstein bound, producing, perhaps surprisingly, lower bounds on the computational deceleration and information loss incurred during an increase in predictability about the process. The identified relationships set new computational limits in terms of fundamental physical quantities, and establish transfer entropy as a central measure connecting information theory, thermodynamics and theory of computation.
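
For reference, the two quantities being connected are, in their standard forms (not reproduced from the paper):

    T_{Y \to X} = \sum p(x_{t+1}, x_t, y_t) \log_2 \frac{p(x_{t+1} \mid x_t, y_t)}{p(x_{t+1} \mid x_t)}, \qquad Q \ge k_B T \ln 2 \text{ per bit erased,}

i.e. Schreiber's transfer entropy (here with history length one) and Landauer's limit. The headline result can then be read as: each additional bit of T_{Y \to X} must be matched by at least k_B T \ln 2 of heat flow.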

Applying Information Theory to Neuronal Networks: From Theory to Experiments

Information theory is increasingly being used to analyze complex, self-organizing processes on networks, predominantly in analytical and numerical studies. Perhaps one of the most paradigmatic complex systems is a network of neurons, in which cognition arises from the information storage, transfer, and processing among individual neurons. In this article we review experimental techniques suitable for validating information-theoretical predictions in simple neural networks, as well as for generating new hypotheses. Specifically, we focus on techniques that may be used to measure both network (microcircuit) anatomy and neuronal activity simultaneously. This is needed to study the role of network structure in the emergent collective dynamics, which is one of the reasons to study the characteristics of information processing. We discuss in detail two suitable techniques, namely calcium imaging and the application of multi-electrode arrays to simple neural networks in culture, and discuss their advantages and limitations in a manner accessible to non-experts. In particular, we show that each technique induces a qualitatively different type of error on the measured mutual information. The ultimate goal of this work is to bridge the gap between theorists and experimentalists in their shared goal of understanding the behavior of networks of neurons.

JIDT: Java Information Dynamics Toolkit for studying information-theoretic measures of computation in complex systems


JIDT provides a stand-alone, open-source Java implementation (usable in Matlab, Octave, Python and R) of information-theoretic measures of distributed computation in complex systems, i.e., information storage, transfer and modification.

JIDT includes implementations:

- principally for the measures transfer entropy, mutual information, and their conditional variants, as well as active information storage, entropy, etc.;
- for both discrete and continuous-valued data;
- using various types of estimators (e.g. Kraskov-Stögbauer-Grassberger estimators, box-kernel estimation, linear-Gaussian),

as described in full at ImplementedMeasures.

JIDT is distributed under the GNU GPL v3 license (or later).
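
A minimal usage sketch in Python via JPype, modeled on the demo scripts bundled with JIDT (class and method names follow those demos; treat the exact signatures as assumptions and consult ImplementedMeasures and the distribution's examples for the authoritative API):

    # Discrete transfer entropy with JIDT from Python. Requires the jpype1
    # package and infodynamics.jar on the Java classpath.
    from jpype import startJVM, getDefaultJVMPath, JArray, JInt, JPackage
    import random

    startJVM(getDefaultJVMPath(), "-Djava.class.path=infodynamics.jar")

    # Toy data: the destination copies the source with a one-step delay,
    # so the measured transfer entropy should approach 1 bit.
    source = [random.randint(0, 1) for _ in range(10000)]
    dest = [0] + source[:-1]

    discrete = JPackage("infodynamics.measures.discrete")
    te_calc = discrete.TransferEntropyCalculatorDiscrete(2, 1)  # base 2, history 1
    te_calc.initialise()
    te_calc.addObservations(JArray(JInt, 1)(source), JArray(JInt, 1)(dest))
    print("TE(source -> dest) =", te_calc.computeAverageLocalOfObservations())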


Electronically erased

The fate of Schrödinger's cat depends on the particular path of a single electron. If the electron hits the trigger, which opens a bottle of poisonous gas, then the cat dies. If the path misses the trigger, the cat lives. According to quantum mechanics, electrons do not normally follow a definite trajectory. Instead, an electron can trace both paths at the same time. As a result, Schrödinger's cat is both dead and alive. Quantum mechanics also teaches us that a “which-path” detector—that is, any measurement device that can show where the electron travels—forces the electron to choose just one path, thus sealing the cat's destiny. But what will happen after the information collected by the detector has been erased? The results of a semiconductor version of such an experiment, reported by Weisz et al. on page 1363 of this issue (1), suggest that the cat will come back to the middle ground between life and death.

Rick Quax's insight:

Erasing information restores a quantum state? Can information indeed be a fundamental concept in physics?


Entropy in the Critical Zone: A Comprehensive Review (Entropy)

Thermodynamic entropy was initially proposed by Clausius in 1865. Since then it has been implemented in the analysis of different systems, and is seen as a promising concept to understand the evolution of open systems in non-equilibrium conditions. Information entropy was proposed by Shannon in 1948, and has become an important concept to measure information in different systems. Both thermodynamic entropy and information entropy have been extensively applied in different fields related to the Critical Zone, such as hydrology, ecology, pedology, and geomorphology. In this study, we review the most important applications of these concepts in those fields, including how they are calculated, and how they have been utilized to analyze different processes. We then synthesize the link between thermodynamic and information entropies in the light of energy dissipation and organizational patterns, and discuss how this link may be used to enhance the understanding of the Critical Zone.
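
The formal bridge the review leans on is visible in the definitions themselves (standard formulas, added here for context): Gibbs' thermodynamic entropy and Shannon's information entropy,

    S = -k_B \sum_i p_i \ln p_i, \qquad H = -\sum_i p_i \log_2 p_i,

are identical up to the constant factor k_B \ln 2, which is what allows energy-dissipation arguments and information-based arguments about Critical Zone processes to be phrased in a common currency.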

Local active information storage as a tool to understand distributed neural information processing (Frontiers in Neuroinformatics)

Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today's digital computers, the application of these concepts to neural information processing was hampered by the lack of proper mathematical definitions of these operations on information. Recently, such definitions were given and the specific concept of local active information storage (LAIS) was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure local active information storage in neural data has been made to date. Here we measure local active information storage on a local scale in time and space in voltage-sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding.
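
For readers new to the measure, the standard definitions from the information-dynamics literature are (not specific to this paper's data):

    A_X = I(X_t^{(k)}; X_{t+1}), \qquad a_X(t+1) = \log_2 \frac{p(x_{t+1} \mid x_t^{(k)})}{p(x_{t+1})},

where x_t^{(k)} is the length-k past of the process: A_X is the average active information storage and a_X its local value, which goes negative precisely when the next state is misinformative given the past, matching the "surprise upon unexpected stimulus change" reported above.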

Guided Self-Organization: Inception (Emergence, Complexity and Computation), edited by Mikhail Prokopenko

Is it possible to guide the process of self-organisation towards specific patterns and outcomes?

Phys. Rev. Lett. 111, 177203 (2013): Information Flow in a Kinetic Ising Model Peaks in the Disordered Phase


New scaling relation for information transfer in biological networks

We quantify characteristics of the informational architecture of two representative biological networks: the Boolean network model for the cell-cycle regulatory network of the fission yeast Schizosaccharomyces [...]

Quantifying information transfer and mediation along causal pathways in complex systems

Measures of information transfer have become a popular approach to analyze interactions in complex systems such as the Earth or the human brain from measured time series. Recent work has focused on causal definitions of information transfer aimed at decompositions of predictive information about a target variable, while excluding effects of common drivers and indirect influences. While common drivers clearly constitute a spurious causality, the aim of the present article is to develop measures quantifying different notions of the strength of information transfer along indirect causal paths, based on first reconstructing the multivariate causal network. Another class of novel measures quantifies to what extent different intermediate processes on causal paths contribute to an interaction mechanism to determine pathways of causal information transfer. The proposed framework complements predictive decomposition schemes by focusing more on the interaction mechanism between multiple processes. A rigorous mathematical framework allows for a clear information-theoretic interpretation that can also be related to the underlying dynamics as proven for certain classes of processes. Generally, however, estimates of information transfer remain hard to interpret for nonlinearly intertwined complex systems. But if experiments or mathematical models are not available, then measuring pathways of information transfer within the causal dependency structure allows at least for an abstraction of the dynamics. The measures are illustrated on a climatological example to disentangle pathways of atmospheric flow over Europe.
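
A basic building block behind such path-based measures is the conditional transfer entropy (a standard quantity; the article's momentary, path-specific measures refine it), which discounts common drivers and indirect routes by conditioning on the remaining processes Z:

    T_{Y \to X \mid Z} = I(Y_t; X_{t+1} \mid X_t, Z_t),

so predictive information that Y only appears to carry because Z drives both Y and X is removed before any transfer is attributed to the link Y -> X.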

Satellite Meeting INFORMATION PROCESSING IN COMPLEX SYSTEMS (IPCS'15)

Rick Quax's insight:

As part of the upcoming CCS’15 (http://www.ccs2015.org/) we are organizing the 4th satellite meeting, titled “Information Processing in Complex Systems” (www.computationalscience.nl/ipcs2015).

The topic is the inherent storage, transfer, and processing of information in any dynamical system ‘simply doing its thing’, in a quantitative, information-theoretical sense (such as, but not limited to, Shannon’s). That is, in a system of interacting units (e.g., spins, or neurons), the idea is that multiple flows of information reach a unit, where they are processed into new information, which then flows onward, and so on. However, it is still an open question how to define and quantify such ‘information processing’, or even whether different frameworks are needed for different questions. Our overarching hypothesis is that studying this underlying process may lead to a better understanding of the emergent behavior of complex systems. The idea of the meeting is that each presentation is a piece of the puzzle, and that by combining them a complete picture will hopefully emerge someday.

Submit your abstract through www.computationalscience.nl/ipcs2015 before June 25!


Information flow through a model of the C. elegans klinotaxis circuit


Understanding how information about external stimuli is transformed into behavior is one of the central goals of neuroscience. Here we characterize the information flow through a complete sensorimotor circuit: from stimulus, to sensory neurons, to interneurons, to motor neurons, to muscles, to motion. Specifically, we apply a recently developed framework for quantifying information flow to a previously published ensemble of models of salt klinotaxis in the nematode worm C. elegans. The models are grounded in the neuroanatomy and currently known neurophysiology of the worm. The unknown model parameters were optimized to reproduce the worm's behavior. Information flow analysis reveals several key principles underlying how the models operate: (1) Interneuron class AIY is responsible for integrating information about positive and negative changes in concentration, and exhibits a strong left/right information asymmetry. (2) Gap junctions play a crucial role in the transfer of information responsible for the information symmetry observed in interneuron class AIZ. (3) Neck motor neuron class SMB implements an information gating mechanism that underlies the circuit's state-dependent response. (4) The neck carries a non-uniform distribution of information about changes in concentration, so not all directions of movement are equally informative. Each of these findings corresponds to an experimental prediction that could be tested in the worm to greatly refine our understanding of the neural circuit underlying klinotaxis. Information flow analysis also allows us to explore how information flow relates to the underlying electrophysiology. Despite large variations in the neural parameters of individual circuits, the overall information flow architecture of the circuit is remarkably consistent across the ensemble, suggesting that information flow analysis captures general principles of operation for the klinotaxis circuit.


Complexity and the Arrow of Time


"There is a widespread assumption that the universe in general, and life in particular, is 'getting more complex with time'. This book brings together a wide range of experts in science, philosophy and theology and unveils their joint effort in exploring this idea. They confront essential problems behind the theory of complexity and the role of life within it: what is complexity? When does it increase, and why? Is the universe evolving towards states of ever greater complexity and diversity? If so, what is the source of this universal enrichment? This book addresses those difficult questions, and offers a unique cross-disciplinary perspective on some of the most profound issues at the heart of science and philosophy. Readers will gain insights in complexity that reach deep into key areas of physics, biology, complexity science, philosophy and religion."

Rick Quax's insight:

Also contains quite a few clues about information-theoretic perspectives, especially in Davies' and Lineweaver's contributions.


Entropy Special Issue: Information Processing in Complex Systems


All systems in nature have one thing in common: they process information. Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred and modified. Indeed, bits of information about the state of one element will travel—imperfectly—to the state of the other element, forming its new state. This storage, transfer, and modification of information, possibly between levels of a multi-level system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions. Mapping out exactly how these bits of information percolate through the system could reveal new fundamental insights into how the parts orchestrate to produce the properties of the system. A theory of information processing would be capable of defining a set of universal properties of dynamical multi-level complex systems, which describe and compare the dynamics of diverse complex systems ranging from social interaction to brain networks, from financial markets to biomedicine. Each possible combination of rules of dynamics and topology of interactions, with disparate semantics, would reduce to a single language of information processing.

Rick Quax's insight:

Submission deadline: February 28, 2015


Information-based fitness and the emergence of criticality in living systems

Rick Quax's insight:

Please correct me if I interpret it wrong. If agent A optimizes its description of another agent B then the Kullback-Leibler divergence (between the state of B and A's representation of B) decreases. This optimization is achieved by tuning the parameters which drive the state of A. Then the Fisher information of A's representation (parameterized by B's state) naturally increases, and an increasing Fisher information is assumed to imply criticality (along the lines of 'sensitivity to parameters').
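
The step from a shrinking divergence to a growing Fisher information rests on the standard local expansion (a textbook identity, added here for context):

    D_{KL}(p_\theta \,\|\, p_{\theta+\delta}) = \tfrac{1}{2} F(\theta)\, \delta^2 + O(\delta^3),

so F(\theta) is exactly the curvature of the Kullback-Leibler divergence in parameter space, which is the 'sensitivity to parameters' reading of criticality invoked above.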


ECCS'14: Satellite Meeting INFORMATION PROCESSING IN COMPLEX SYSTEMS (IPCS'14)

Rick Quax's insight:

This satellite meeting centers especially on the viewpoint that ‘information’ is a quantity that is inherently stored, transferred, and modified (e.g., integrated/synergistic) in (complex) dynamical systems, or, perhaps even more fundamentally, on starting to think of any dynamical process as a (Turing) computational process. This information-centered approach will hopefully bring better theory and understanding for complex systems, for which traditional tools have had limited success. We invite like-minded researchers to meet at the satellite meeting “Information Processing in Complex Systems” (IPCS’14) during the upcoming ECCS’14 conference; this is now the meeting’s 3rd edition.

We aim to grow this meeting into a central forum for a subject whose researchers currently form dispersed and fairly isolated groups with diverse ideas.

Confirmed invited speakers: Prof. Dr. Kristian Lindgren, Dr. Paul Williams, and Hermann Haken.

The schedule is now online at http://computationalscience.nl/ipcs2014/


Information processing using a single dynamical node as complex system


Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing.
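
A toy numerical caricature of the single-node idea (our simplification, not the paper's electronic implementation; all parameter values are illustrative): the delay line is split into N "virtual nodes", the input is multiplied by a random mask, and only a linear readout is trained.

    # Minimal delay-based reservoir on a one-step-ahead prediction task.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 50                         # virtual nodes along the delay line
    mask = rng.uniform(-1, 1, N)   # random input mask over the delay interval
    eta, gamma = 0.5, 0.05         # feedback strength and input scaling

    u = np.sin(0.3 * np.arange(2000))  # input signal; task: predict next sample

    # Drive the node: each virtual node mixes its own value from one full
    # loop of the delay line ago with the masked input, through a tanh.
    states = np.zeros((len(u), N))
    x = np.zeros(N)
    for t in range(len(u)):
        for i in range(N):
            x[i] = np.tanh(eta * x[i] + gamma * mask[i] * u[t])
        states[t] = x

    # Linear readout trained by ridge regression on the first 1500 samples.
    X_tr, y_tr = states[:1500], u[1:1501]
    w = np.linalg.solve(X_tr.T @ X_tr + 1e-6 * np.eye(N), X_tr.T @ y_tr)
    pred = states[1500:-1] @ w
    print("test RMSE:", np.sqrt(np.mean((pred - u[1501:]) ** 2)))

The single nonlinear node thus plays the role of a whole recurrent network, with time-multiplexing standing in for spatial connectivity.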


Equitability, mutual information, and the maximal information coefficient


How should one quantify the strength of association between two random variables without bias for relationships of a specific form? Despite its conceptual simplicity, this notion of statistical “equitability” has yet to receive a definitive mathematical formalization. Here we argue that equitability is properly formalized by a self-consistency condition closely related to the Data Processing Inequality. Mutual information, a fundamental quantity in information theory, is shown to satisfy this equitability criterion. These findings are at odds with the recent work of Reshef et al. [Reshef DN, et al. (2011) Science 334(6062):1518–1524], which proposed an alternative definition of equitability and introduced a new statistic, the “maximal information coefficient” (MIC), said to satisfy equitability in contradistinction to mutual information. These conclusions, however, were supported only with limited simulation evidence, not with mathematical arguments. Upon revisiting these claims, we prove that the mathematical definition of equitability proposed by Reshef et al. cannot be satisfied by any (nontrivial) dependence measure. We also identify artifacts in the reported simulation evidence. When these artifacts are removed, estimates of mutual information are found to be more equitable than estimates of MIC. Mutual information is also observed to have consistently higher statistical power than MIC. We conclude that estimating mutual information provides a natural (and often practical) way to equitably quantify statistical associations in large datasets.
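
A small numerical illustration of the practical point (our sketch, not the authors' analysis code): a plug-in, histogram-based estimate of mutual information registers a noiseless nonlinear relationship without any preference for a particular functional form.

    # Histogram (plug-in) estimate of mutual information in bits.
    # Bin count and sample size are illustrative; serious use needs bias correction.
    import numpy as np

    def mi_hist(x, y, bins=16):
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)   # marginal over x bins
        py = pxy.sum(axis=0, keepdims=True)   # marginal over y bins
        nz = pxy > 0
        return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(1)
    x = rng.uniform(-1, 1, 10_000)
    print(mi_hist(x, x ** 2))                      # large: nonlinear dependence
    print(mi_hist(x, rng.uniform(-1, 1, 10_000)))  # ~0: independent

With finite bins the deterministic case tops out near log2(bins) bits rather than diverging, and the independent case sits slightly above zero due to estimation bias, which is exactly why comparing estimators of mutual information and MIC, rather than the ideal quantities, matters in practice.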



Information Dynamics

Rick Quax's insight:

A philosophy-oriented article about the concept of 'information dynamics' and how information can be 'processed'.


A Trade-off between Local and Distributed Information Processing Associated with Remote Episodic versus Semantic Memory

Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information-theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provide a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.
