Information Processing in Complex Systems
The storage, transfer, loss, and processing of information (bits) in complex systems.
Curated by Rick Quax
Entropy | Special Issue: Information Processing in Complex Systems


All systems in nature have one thing in common: they process information. Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred and modified. Indeed, bits of information about the state of one element travel, imperfectly, to the state of the other element, forming its new state. This storage, transfer, and modification of information, possibly between levels of a multi-level system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions. Mapping out exactly how these bits of information percolate through the system could reveal new fundamental insights into how the parts act in concert to produce the properties of the system. A theory of information processing would be capable of defining a set of universal properties of dynamical multi-level complex systems, which describe and compare the dynamics of diverse complex systems ranging from social interactions to brain networks, from financial markets to biomedicine. Each possible combination of rules of dynamics and topology of interactions, with disparate semantics, would reduce to a single language of information processing.
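
As a minimal illustration of this imperfect transfer (my own sketch, not part of the call; the noise level and sample size are arbitrary), consider one element whose state is a noisy copy of another's. The mutual information between the two states counts how many bits survive the interaction:

```python
import numpy as np

# One element's state is a noisy copy of another's: each bit flips with
# probability p (a binary symmetric channel). The mutual information
# between source and copy, 1 - H(p) bits, measures how much information
# "travels" per interaction. Noise level and sample size are arbitrary.
rng = np.random.default_rng(0)
p = 0.1                                   # flip probability (assumed)
source = rng.integers(0, 2, 100_000)
copy = source ^ (rng.random(source.size) < p)

def entropy(q):
    """Binary entropy H(q) in bits."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -(q * np.log2(q) + (1 - q) * np.log2(1 - q))

# Empirical mutual information I(source; copy) = H(copy) - H(copy | source)
flip_rate = np.mean(source != copy)
mi = entropy(np.mean(copy)) - entropy(flip_rate)
print(f"~{mi:.3f} bits of each bit survive the noisy transfer")  # ~1 - H(0.1) = 0.53
```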

 

 

Rick Quax's insight:

Submission deadline: February 28, 2015


Information-based fitness and the emergence of criticality in living systems

Rick Quax's insight:

Please correct me if I'm interpreting this wrongly. If agent A optimizes its description of another agent B, then the Kullback-Leibler divergence (between the state of B and A's representation of B) decreases. This optimization is achieved by tuning the parameters that drive the state of A. The Fisher information of A's representation (parameterized by B's state) then naturally increases, and an increasing Fisher information is assumed to imply criticality (along the lines of 'sensitivity to parameters').
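
A standard identity seems to underlie this reading: the Kullback-Leibler divergence between nearby members of a parametric family is, to second order, given by the Fisher information. In my notation (not the paper's):

```latex
% KL divergence between nearby distributions in a family p_\theta
% expands to second order in the parameter perturbation \delta\theta:
D_{\mathrm{KL}}\!\left(p_\theta \,\|\, p_{\theta+\delta\theta}\right)
  = \tfrac{1}{2}\, F(\theta)\, \delta\theta^2 + O(\delta\theta^3),
\qquad
F(\theta) = \mathbb{E}_{p_\theta}\!\left[\left(
  \frac{\partial \log p_\theta(x)}{\partial \theta}\right)^{\!2}\right]
% So if A tunes \theta to shrink the KL divergence to B's state, the
% curvature F(\theta) governs exactly the "sensitivity to parameters"
% that the criticality argument invokes.
```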


ECCS'14: Satellite Meeting: INFORMATION PROCESSING IN COMPLEX SYSTEMS (IPCS'14)

Rick Quax's insight:

This satellite meeting centers on the viewpoint that 'information' is a quantity that is inherently stored, transferred, and modified (e.g., integrated or synergistic) in (complex) dynamical systems, or, perhaps even more fundamentally, on starting to think of any dynamical process as a (Turing) computational process. This information-centered approach will hopefully bring better theory and understanding for complex systems, for which traditional tools have had limited success. We invite like-minded researchers to meet at the satellite meeting “Information Processing in Complex Systems” (IPCS'14), now in its third edition, during the upcoming ECCS'14 conference.

 

We aim to grow this meeting into a central forum for a subject whose researchers currently form dispersed and fairly isolated groups with diverse ideas.

 

Confirmed invited speakers: Prof. Kristian Lindgren, Dr. Paul Williams, and Hermann Haken.

 

The schedule is now online at http://computationalscience.nl/ipcs2014/


Information processing using a single dynamical node as complex system


Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing.
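
Below is a minimal numerical sketch of the delay-based reservoir idea, not the paper's electronic implementation: the update rule is a simplified discrete-time map rather than the true delay-differential equation, and every parameter value is an assumption chosen for illustration. The single tanh node stands in for an entire network; an input mask desynchronizes the N virtual nodes along the delay line, and only the linear readout is trained.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50                     # virtual nodes per delay interval (assumed)
eta, gamma = 0.5, 0.05     # feedback and input scaling (assumed values)
mask = rng.choice([-1.0, 1.0], size=N)   # fixed random input mask

def run_reservoir(u):
    """Drive the single node with input sequence u; return states (len(u), N)."""
    x = np.zeros(N)                      # virtual-node states, one delay interval
    states = np.empty((len(u), N))
    for t, ut in enumerate(u):
        prev = x.copy()                  # values one delay interval ago
        for i in range(N):
            # nonlinear node: delayed feedback plus masked input sample
            x[i] = np.tanh(eta * prev[i] + gamma * mask[i] * ut)
        states[t] = x
    return states

# Toy task: one-step-ahead prediction of a noisy sine wave.
T = 1000
u = np.sin(0.2 * np.arange(T)) + 0.05 * rng.standard_normal(T)
X = run_reservoir(u[:-1])
y = u[1:]

# Linear ridge readout, trained on the first 800 steps only.
lam = 1e-6
W = np.linalg.solve(X[:800].T @ X[:800] + lam * np.eye(N), X[:800].T @ y[:800])
pred = X[800:] @ W
print("test NMSE:", np.mean((pred - y[800:])**2) / np.var(y[800:]))
```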


Equitability, mutual information, and the maximal information coefficient


How should one quantify the strength of association between two random variables without bias for relationships of a specific form? Despite its conceptual simplicity, this notion of statistical “equitability” has yet to receive a definitive mathematical formalization. Here we argue that equitability is properly formalized by a self-consistency condition closely related to the Data Processing Inequality. Mutual information, a fundamental quantity in information theory, is shown to satisfy this equitability criterion. These findings are at odds with the recent work of Reshef et al. [Reshef DN, et al. (2011) Science 334(6062):1518–1524], which proposed an alternative definition of equitability and introduced a new statistic, the “maximal information coefficient” (MIC), said to satisfy equitability in contradistinction to mutual information. These conclusions, however, were supported only with limited simulation evidence, not with mathematical arguments. Upon revisiting these claims, we prove that the mathematical definition of equitability proposed by Reshef et al. cannot be satisfied by any (nontrivial) dependence measure. We also identify artifacts in the reported simulation evidence. When these artifacts are removed, estimates of mutual information are found to be more equitable than estimates of MIC. Mutual information is also observed to have consistently higher statistical power than MIC. We conclude that estimating mutual information provides a natural (and often practical) way to equitably quantify statistical associations in large datasets.


Via Ashish Umre
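
Mutual information's invariance under invertible transformations of either variable is the property that underlies its equitability in the sense above. A quick check with a naive plug-in estimator (histogram binning; the bin count and noise level are arbitrary, and the paper's analysis relies on more careful estimators):

```python
import numpy as np

def mi_hist(x, y, bins=20):
    """Naive plug-in mutual information estimate (bits) from a 2D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 50_000)
y = x + 0.1 * rng.standard_normal(x.size)

# True MI is identical for x and any invertible reparameterization of x:
print("I(x ; y)   ~", round(mi_hist(x, y), 3), "bits")
print("I(x^3 ; y) ~", round(mi_hist(x**3, y), 3), "bits")
```

The two printed estimates agree only approximately, because equal-width binning is itself not reparameterization-invariant; the underlying mutual information is exactly equal.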

Information Dynamics

Rick Quax's insight:

A philosophy-oriented article about the concept of 'information dynamics' and how information can be 'processed'.


A Trade-off between Local and Distributed Information Processing Associated with Remote Episodic versus Semantic Memory

Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provide a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.


Information Flow in a Kinetic Ising Model Peaks in the Disordered Phase

There is growing evidence that for a range of dynamical systems featuring complex interactions between large ensembles of interacting elements, mutual information peaks at order-disorder phase transitions. We conjecture that, by contrast, information flow in such systems will generally peak strictly on the disordered side of a phase transition. This conjecture is verified for a ferromagnetic 2D lattice Ising model with Glauber dynamics and a transfer entropy-based measure of systemwide information flow. Implications of the conjecture are considered, in particular that, for a complex dynamical system in the process of transitioning from disordered to ordered dynamics (a mechanism implicated, for example, in financial market crashes and the onset of some types of epileptic seizures), information dynamics may be able to predict an imminent transition.

 

Lionel Barnett, Joseph T. Lizier, Michael Harré, Anil K. Seth, and Terry Bossomaier

"Information Flow in a Kinetic Ising Model Peaks in the Disordered Phase"

Physical Review Letters 111, 177203 (2013)

http://link.aps.org/doi/10.1103/PhysRevLett.111.177203


Via Complexity Digest
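
A minimal sketch in the spirit of this setup: Glauber dynamics on a small periodic lattice, with a plug-in transfer entropy between two adjacent spins rather than the systemwide measure used by Barnett et al. Lattice size, run length, and temperature values are assumptions kept small so the script runs quickly:

```python
import numpy as np

def glauber_sweep(s, beta, rng):
    """One sweep of single-spin Glauber updates on an LxL periodic lattice."""
    L = s.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        h = s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]
        # Glauber rule: flip spin with probability 1 / (1 + exp(2*beta*h*s_ij))
        if rng.random() < 1.0 / (1.0 + np.exp(2.0 * beta * h * s[i, j])):
            s[i, j] = -s[i, j]

def transfer_entropy(x, y):
    """Plug-in TE(X -> Y) in bits for +/-1 series, one step of history."""
    a, b, c = (y[1:] > 0).astype(int), (y[:-1] > 0).astype(int), (x[:-1] > 0).astype(int)
    p = np.zeros((2, 2, 2))                 # joint of (y_next, y_now, x_now)
    np.add.at(p, (a, b, c), 1.0)
    p /= p.sum()
    p_bc, p_ab, p_b = p.sum(0), p.sum(2), p.sum((0, 2))
    i, j, k = np.nonzero(p)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] * p_b[j] / (p_ab[i, j] * p_bc[j, k]))))

rng = np.random.default_rng(0)
L, sweeps = 12, 2000
for T in [1.5, 2.27, 3.0, 4.0]:             # critical temperature is ~2.269
    s = rng.choice([-1, 1], size=(L, L))
    xs, ys = np.empty(sweeps), np.empty(sweeps)
    for t in range(sweeps):
        glauber_sweep(s, 1.0 / T, rng)
        xs[t], ys[t] = s[0, 0], s[0, 1]     # record two neighbouring spins
    print(f"T = {T:4.2f}   TE ~ {transfer_entropy(xs, ys):.4f} bits")
```

With enough sweeps, the transfer entropy estimate should peak at a temperature above the critical one, consistent with the conjecture, though this pairwise toy version is noisy.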

Information and closure in systems theory

The notion of closure plays a prominent role in systems theory, where it is used to identify or define the system in distinction from its environment and to explain the autonomy of the system. Here we present a quantitative measure of closure, as opposed to the already existing qualitative notions. We elaborate upon the observation that cognitive systems can achieve informational closure by modeling their environment. Formally, then, a system is informationally closed if (almost) no information flows into it from the environment. A system that is independent from its environment trivially achieves informational closure. Simulations of coupled hidden Markov models demonstrate that informational closure can also be realized non-trivially, by modeling or controlling the environment. Our analysis of systems that actively influence their environment to achieve closure then reveals interesting connections to the related notion of autonomy. This discussion calls into question the system-environment distinction that seemed so innocent to begin with. It turns out that the notion of autonomy depends crucially on whether not just the state observables but also the dynamical processes are attributed to either the system or the environment. In this manner, our conceptualization of informational closure also sheds light on other, more ambitious notions of closure, e.g., organizational closure, semantic closure, closure to efficient cause, and operational closure, intended as fundamental (defining) concepts of life itself.
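
In symbols, the measure described here is presumably the information flowing from the environment E into the system S, a conditional mutual information (my notation, reconstructed from the abstract):

```latex
% A system S with environment E is informationally closed when the
% information flow from the environment into the system vanishes:
I\left(E_t ; S_{t+1} \mid S_t\right) \approx 0
% This can hold trivially, when S and E are independent, or
% non-trivially, when S models or controls E so well that E_t adds
% nothing beyond what S_t already conveys about S_{t+1}.
```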


A framework for the local information dynamics of distributed computation in complex systems

The nature of distributed computation has often been described in terms of the component operations of universal computation: information storage, transfer and modification. We review the first complete framework that quantifies each of these individual information dynamics on a local scale within a system, and describes the manner in which they interact to create non-trivial computation where "the whole is greater than the sum of the parts". We describe the application of the framework to cellular automata, a simple yet powerful model of distributed computation. This is an important application, because the framework is the first to provide quantitative evidence for several important conjectures about distributed computation in cellular automata: that blinkers embody information storage, particles are information transfer agents, and particle collisions are information modification events. The framework is also shown to contrast the computations conducted by several well-known cellular automata, highlighting the importance of information coherence in complex computation. The results reviewed here provide important quantitative insights into the fundamental nature of distributed computation and the dynamics of complex systems, as well as impetus for the framework to be applied to the analysis and design of other systems.
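
For reference, the framework's local (pointwise) measures take the following form, with x_{i,n}^{(k)} denoting the length-k past of cell i at time n; this is the standard notation from Lizier et al.'s papers:

```latex
% Local active information storage of cell i at time n+1:
a(i, n+1) = \log_2 \frac{p\bigl(x_{i,n}^{(k)},\, x_{i,n+1}\bigr)}
                        {p\bigl(x_{i,n}^{(k)}\bigr)\, p\bigl(x_{i,n+1}\bigr)}

% Local transfer entropy from a source cell j to destination cell i:
t(j \to i, n+1) = \log_2 \frac{p\bigl(x_{i,n+1} \mid x_{i,n}^{(k)},\, x_{j,n}\bigr)}
                              {p\bigl(x_{i,n+1} \mid x_{i,n}^{(k)}\bigr)}
```

Averaging these local values over space and time recovers the usual active information storage and transfer entropy; it is the local values that pick out blinkers as storage and particles as transfer.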


Towards a Synergy-based Approach to Measuring Information Modification

Distributed computation in artificial life and complex systems is often described in terms of component operations on information: information storage, transfer, and modification. Information modification remains poorly described, however: the popularly understood examples of glider and particle collisions in cellular automata have to date been identified quantitatively only using a heuristic (separable information) rather than a proper information-theoretic measure. We outline how a recently introduced axiomatic framework for measuring information redundancy and synergy, called partial information decomposition, can be applied to a perspective of distributed computation in order to quantify component operations on information. Using this framework, we propose a new measure of information modification that captures the intuitive understanding of information modification events as those involving interactions between two or more information sources. We also consider how the local dynamics of information modification in space and time could be measured, and suggest a new axiom that redundancy measures would need to meet in order to make such local measurements. Finally, we evaluate the potential for existing redundancy measures to meet this localizability axiom.
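
For two sources, the partial information decomposition referred to above splits the mutual information into four non-negative parts (the Williams-Beer structure; term names vary across papers):

```latex
% Redundant, unique, and synergistic contributions of sources X_1, X_2
% to the information about a target S:
I(S ; X_1, X_2) = R(S ; X_1, X_2) + U(S ; X_1) + U(S ; X_2) + C(S ; X_1, X_2)

% The single-source informations decompose consistently:
I(S ; X_1) = R(S ; X_1, X_2) + U(S ; X_1), \qquad
I(S ; X_2) = R(S ; X_1, X_2) + U(S ; X_2)
```

A synergy-based measure of information modification would then identify modification events with the synergistic term C.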


The diminishing role of hubs in dynamical processes on complex networks

It is notoriously difficult to predict the behaviour of a complex self-organizing system, where the interactions among dynamical units form a heterogeneous topology. Even if the dynamics of each microscopic unit is known, a real understanding of their contributions to the macroscopic system behaviour is still lacking. Here, we develop information-theoretical methods to distinguish the contribution of each individual unit to the collective out-of-equilibrium dynamics. We show that for a system of units connected by a network of interaction potentials with an arbitrary degree distribution, highly connected units have less impact on the system dynamics when compared with intermediately connected units. In an equilibrium setting, the hubs are often found to dictate the long-term behaviour. However, we find both analytically and experimentally that the instantaneous states of these units have a short-lasting effect on the state trajectory of the entire system. We present qualitative evidence of this phenomenon from empirical findings about a social network of product recommendations, a protein–protein interaction network and a neural network, suggesting that it might indeed be a widespread property in nature.

 

"The diminishing role of hubs in dynamical processes on complex networks"

Quax R, Apolloni A and Sloot P.M.A.

Journal of the Royal Society Interface, 10, 20130568, published 4 September 2013

http://dx.doi.org/10.1098/rsif.2013.0568


Via Complexity Digest
Rick Quax's insight:

The inherent storage and transmission of information bits may provide new insights into the behavior of complex systems.

June Holley's curator insight, September 18, 2013 1:15 PM

Hubs aren't as important as we think - in complex networks!


Information dissipation as an early-warning signal for the Lehman Brothers collapse in financial time series | Scientific Reports

In financial markets, participants locally optimize their profit, which can result in a globally unstable state leading to a catastrophic change.

Applying Information Theory to Neuronal Networks: From Theory to Experiments

Information theory is increasingly being used to analyze complex, self-organizing processes on networks, predominantly in analytical and numerical studies. Perhaps one of the most paradigmatic complex systems is a network of neurons, in which cognition arises from the information storage, transfer, and processing among individual neurons. In this article we review experimental techniques suitable for validating information-theoretical predictions in simple neural networks, as well as for generating new hypotheses. Specifically, we focus on techniques that may be used to measure both network (microcircuit) anatomy and neuronal activity simultaneously. This is needed to study the role of network structure in the emergent collective dynamics, which is one of the reasons to study the characteristics of information processing. We discuss in detail two suitable techniques, namely calcium imaging and the application of multi-electrode arrays to simple neural networks in culture, and discuss their advantages and limitations in a manner accessible to non-experts. In particular, we show that each technique induces a qualitatively different type of error on the measured mutual information. The ultimate goal of this work is to bridge the gap between theorists and experimentalists in their shared goal of understanding the behavior of networks of neurons.

JIDT: Java Information Dynamics Toolkit for studying information-theoretic measures of computation in complex systems


JIDT provides a stand-alone, open-source Java implementation (usable from Matlab, Octave, Python, and R) of information-theoretic measures of distributed computation in complex systems, i.e., information storage, transfer, and modification.

 

JIDT includes implementations:

- principally for the measures transfer entropy, mutual information, and their conditional variants, as well as active information storage, entropy, etc.;
- for both discrete and continuous-valued data;
- using various types of estimators (e.g., Kraskov-Stögbauer-Grassberger estimators, box-kernel estimation, linear-Gaussian),

as described in full at ImplementedMeasures.

 

JIDT is distributed under the GNU GPL v3 license (or later).
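
A minimal sketch of driving JIDT from Python via JPype, following the pattern of the demos bundled with the toolkit. The class and method names are taken from the JIDT documentation as I recall it, and the jar path is an assumption, so check this against the distribution's own Python demos:

```python
import numpy as np
import jpype

# Start the JVM with JIDT's jar on the classpath (path is an assumption;
# point it at your copy of infodynamics.jar).
jpype.startJVM(jpype.getDefaultJVMPath(), "-Djava.class.path=infodynamics.jar")

# Discrete transfer entropy calculator: base-2 data, destination history k = 1.
TECalc = jpype.JClass(
    "infodynamics.measures.discrete.TransferEntropyCalculatorDiscrete")
te_calc = TECalc(2, 1)
te_calc.initialise()

rng = np.random.default_rng(0)
source = rng.integers(0, 2, 1000)
dest = np.roll(source, 1)   # destination copies the source with a one-step lag
te_calc.addObservations(jpype.JArray(jpype.JInt)(source.tolist()),
                        jpype.JArray(jpype.JInt)(dest.tolist()))
print("TE(source -> dest) =", te_calc.computeAverageLocalOfObservations(), "bits")

jpype.shutdownJVM()
```

For this lagged-copy toy data the estimate should approach 1 bit, since the destination's next state is fully determined by the source.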


Electronically erased

The fate of Schrödinger's cat depends on the particular path of a single electron. If the electron hits the trigger, which opens a bottle of poisonous gas, then the cat dies. If the path misses the trigger, the cat lives. According to quantum mechanics, electrons do not normally follow a definite trajectory. Instead, an electron can trace both paths at the same time. As a result, Schrödinger's cat is both dead and alive. Quantum mechanics also teaches us that a “which-path” detector—that is, any measurement device that can show where the electron travels—forces the electron to choose just one path, thus sealing the cat's destiny. But what will happen after the information collected by the detector has been erased? The results of a semiconductor version of such an experiment, reported by Weisz et al. on page 1363 of this issue (1), suggest that the cat will come back to the middle ground between life and death.

Rick Quax's insight:

Erasing information restores a quantum state? Can information indeed be a fundamental concept in physics?


Entropy | Entropy in the Critical Zone: A Comprehensive Review

Thermodynamic entropy was initially proposed by Clausius in 1865. Since then it has been implemented in the analysis of different systems, and is seen as a promising concept to understand the evolution of open systems in non-equilibrium conditions. Information entropy was proposed by Shannon in 1948, and has become an important concept to measure information in different systems. Both thermodynamic entropy and information entropy have been extensively applied in different fields related to the Critical Zone, such as hydrology, ecology, pedology, and geomorphology. In this study, we review the most important applications of these concepts in those fields, including how they are calculated, and how they have been utilized to analyze different processes. We then synthesize the link between thermodynamic and information entropies in the light of energy dissipation and organizational patterns, and discuss how this link may be used to enhance the understanding of the Critical Zone.

Frontiers in Neuroinformatics | Local active information storage as a tool to understand distributed neural information processing

Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today's digital computers, the application of these concepts to neural information processing has been hampered by the lack of proper mathematical definitions of these operations on information. Recently, such definitions were given, and the specific concept of local active information storage (LAIS) was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure local active information storage in neural data has been made to date. Here we measure local active information storage, on a local scale in time and space, in voltage-sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity for testing theories of cortical function, such as predictive coding.

Guided Self-Organization: Inception (Emergence, Complexity and Computation), by Mikhail Prokopenko

Is it possible to guide the process of self-organisation towards specific patterns and outcomes?


Autonomy: An information theoretic perspective


We present a tentative proposal for a quantitative measure of autonomy. This is something that, surprisingly, is rarely found in the literature, even though autonomy is considered to be a basic concept in many disciplines, including artificial life.

 

We work in an information theoretic setting for which the distinction between system and environment is the starting point. As a first measure for autonomy, we propose the conditional mutual information between consecutive states of the system, conditioned on the history of the environment. This works well when the system cannot influence the environment at all and the environment does not interact synergistically with the system. When, in contrast, the system has full control over its environment, we should instead neglect the environment history and simply take the mutual information between consecutive system states as a measure of autonomy.

 

In the case of mutual interaction between system and...
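
In symbols, the two candidate measures described above would read as follows (my notation; E_t^{(h)} is a length-h environment history):

```latex
% Autonomy as information shared between consecutive system states
% beyond what the environment history accounts for:
A_1 = I\bigl(S_{t+1} ; S_t \mid E_t^{(h)}\bigr)

% When the system fully controls its environment, the environment
% history is not conditioned away:
A_2 = I\bigl(S_{t+1} ; S_t\bigr)
```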


Shared Information -- New Insights and Problems in Decomposing Information in Complex Systems


How can the information that a set of random variables contains about another random variable be decomposed? To what extent do different subgroups provide the same, i.e., shared or redundant, information, carry unique information, or interact for the emergence of synergistic information? Recently, Williams and Beer proposed such a decomposition based on natural properties for shared information. While these properties fix the structure of the decomposition, they do not uniquely specify the values of the different terms. Therefore, we investigate additional properties such as strong symmetry and left monotonicity. We find that strong symmetry is incompatible with the properties proposed by Williams and Beer. Although left monotonicity is a very natural property for an information measure, it is not fulfilled by any of the proposed measures. We also study a geometric framework for information decompositions and ask whether it is possible to represent shared information by a family of posterior distributions. Finally, we draw connections to the notions of shared knowledge and common knowledge in game theory. While many people believe that independent variables cannot share information, we show that in game theory independent agents can have shared knowledge, but not common knowledge. We conclude that intuition and heuristic arguments do not suffice when arguing about information.


Inferring effective computational connectivity using incrementally conditioned multivariate transfer entropy

Poster presentation


Information Causality as a Physical Principle


Nature

Rick Quax's insight:

Could this also be relevant without the quantum aspect?


ECCS'13: Satellite Meeting: INFORMATION PROCESSING IN COMPLEX SYSTEMS (IPCS'13)

Rick Quax's insight:

The focus of IPCS'13 will be on information processing as a novel paradigm in understanding and modelling complex systems.

 

All systems in nature have one thing in common: they process information. Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred. Indeed, bits of information about the state of one element will travel, imperfectly, to the state of the other element, forming its new state. This storage and transfer of information, possibly between levels of a multi-level system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions. Mapping out exactly how these bits of information percolate through the system could reveal new fundamental insights into how the parts act in concert to produce the properties of the system. A theory of information processing would be capable of defining a set of universal properties of dynamical multi-level complex systems, which describe and compare the dynamics of diverse complex systems ranging from social interactions to brain networks, from financial markets to biomedicine. Each possible combination of rules of dynamics and topology of interactions, with disparate semantics, would reduce to a single language of information processing.
