Information Processing in Complex Systems | Scoop.it
http://www.scoop.it/t/information-processing-in-complex-systems
All the new curated posts for the topic: Information Processing in Complex Systems
Last updated: Thu, 11 Feb 2016 01:53:20 GMT (curated by Rick Quax)
Satellite Meeting INFORMATION PROCESSING IN COMPLEX SYSTEMS (IPCS'15)
http://www.scoop.it/t/information-processing-in-complex-systems/p/4045022192/2015/06/03/satellite-meeting-information-processing-in-complex-systems-ipcs-15
<img src='http://img.scoop.it/P8J5QOIPJpqXumsc90SeMTl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><br/>Rick Quax's insight:<br/>As part of the upcoming CCS’15 (<a href="http://www.ccs2015.org" rel="nofollow">http://www.ccs2015.org</a>) we are organizing the 4th satellite meeting, titled “Information Processing in Complex Systems” (<a href="http://www.computationalscience.nl/ipcs2015" rel="nofollow">www.computationalscience.nl/ipcs2015</a>).<br/> <br/>The topic is the inherent storage, transfer, and processing of information in any dynamical system ‘simply doing its thing’, in a quantitative, information-theoretical sense (such as, but not limited to, Shannon’s). That is, in a system of interacting units (e.g., spins, or neurons), the idea is that multiple flows of information reach a unit, where they are processed into new information, which then flows onward, and so on. However, it is still an open question how to define and quantify such ‘information processing’, or even whether different frameworks are needed for different questions. Our overarching hypothesis is that studying this underlying process may lead to a better understanding of the emergent behavior of complex systems. The idea of the meeting is that each presentation is a piece of the puzzle, and that by combining them a complete picture will hopefully emerge someday.<br/> <br/>Submit your abstract through <a href="http://www.computationalscience.nl/ipcs2015" rel="nofollow">www.computationalscience.nl/ipcs2015</a> before June 25!<br /><br /><div ><a href='http://www.scoop.it/t/information-processing-in-complex-systems/p/4045022192/2015/06/03/satellite-meeting-information-processing-in-complex-systems-ipcs-15'>See it on Scoop.it</a>, via <a href='http://www.scoop.it/t/information-processing-in-complex-systems'>Information Processing in Complex Systems</a></div>
Quantifying synergistic mutual information
http://www.scoop.it/t/information-processing-in-complex-systems/p/4039257938/2015/03/16/quantifying-synergistic-mutual-information
<p>Quantifying cooperation or synergy among random variables in predicting a single target random variable is an important problem in many complex systems. We review three prior information-theoretic measures of synergy and introduce a novel synergy measure defined as the difference between the whole and the union of its parts. We apply all four measures against a suite of binary circuits to demonstrate that our measure alone quantifies the intuitive concept of synergy across all examples. We show that, for our measure of synergy, independent predictors can have positive redundant information.</p>
Information flow through a model of the C. elegans klinotaxis circuit
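The "whole minus the sum of its parts" intuition behind synergy is easiest to see on the XOR gate, a standard member of the kind of binary-circuit suite the abstract mentions: neither input alone carries any information about the output, yet together they determine it completely. A minimal sketch with plug-in estimates in bits (this illustrates the intuition only, not the paper's union-information measure, which is defined differently):

```python
import itertools
from collections import Counter
from math import log2

def mutual_information(samples, xs_idx, y_idx):
    """Plug-in estimate of I(X; Y) in bits from a list of joint samples (tuples)."""
    n = len(samples)
    joint = Counter((tuple(s[i] for i in xs_idx), s[y_idx]) for s in samples)
    px = Counter(tuple(s[i] for i in xs_idx) for s in samples)
    py = Counter(s[y_idx] for s in samples)
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

# XOR gate: Y = X1 xor X2, with X1, X2 uniform and independent.
samples = [(x1, x2, x1 ^ x2) for x1, x2 in itertools.product([0, 1], repeat=2)]

i1 = mutual_information(samples, (0,), 2)     # I(X1; Y) = 0 bits
i2 = mutual_information(samples, (1,), 2)     # I(X2; Y) = 0 bits
i12 = mutual_information(samples, (0, 1), 2)  # I(X1, X2; Y) = 1 bit

# "Whole minus sum": 1 bit appears only in the joint predictor. (This naive
# difference can go negative when the predictors are redundant, which is
# exactly why the more careful measures reviewed in the paper are needed.)
synergy = i12 - (i1 + i2)
```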
http://www.scoop.it/t/information-processing-in-complex-systems/p/4038910857/2015/03/11/information-flow-through-a-model-of-the-c-elegans-klinotaxis-circuit
<img src='http://img.scoop.it/5RR-nzhEXdiyCNwpdwAJhzl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>Understanding how information about external stimuli is transformed into behavior is one of the central goals of neuroscience. Here we characterize the information flow through a complete sensorimotor circuit: from stimulus, to sensory neurons, to interneurons, to motor neurons, to muscles, to motion. Specifically, we apply a recently developed framework for quantifying information flow to a previously published ensemble of models of salt klinotaxis in the nematode worm C. elegans. The models are grounded in the neuroanatomy and currently known neurophysiology of the worm. The unknown model parameters were optimized to reproduce the worm's behavior. Information flow analysis reveals several key principles underlying how the models operate: (1) Interneuron class AIY is responsible for integrating information about positive and negative changes in concentration, and exhibits a strong left/right information asymmetry. (2) Gap junctions play a crucial role in the transfer of information responsible for the information symmetry observed in interneuron class AIZ. (3) Neck motor neuron class SMB implements an information gating mechanism that underlies the circuit's state-dependent response. (4) The neck carries a non-uniform distribution of information about changes in concentration; thus, not all directions of movement are equally informative. Each of these findings corresponds to an experimental prediction that could be tested in the worm to greatly refine our understanding of the neural circuit underlying klinotaxis. Information flow analysis also allows us to explore how information flow relates to the underlying electrophysiology. Despite large variations in the neural parameters of individual circuits, the overall information flow architecture of the circuit is remarkably consistent across the ensemble, suggesting that information flow analysis captures general principles of operation for the klinotaxis circuit.</p>
Information Processing and Dynamics in Minimally Cognitive Agents
http://www.scoop.it/t/information-processing-in-complex-systems/p/4038470603/2015/03/04/information-processing-and-dynamics-in-minimally-cognitive-agents
<img src='http://img.scoop.it/h3ir21QkI_Gk6rUfjOdRmTl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science.</p>
Complexity and the Arrow of Time | History, philosophy and foundations of physics
http://www.scoop.it/t/information-processing-in-complex-systems/p/4036116428/2015/01/28/complexity-and-the-arrow-of-time-history-philosophy-and-foundations-of-physics
<img src='http://img.scoop.it/ACLjsUycRQEuIqb2UcQGDTl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>"There is a widespread assumption that the universe in general, and life in particular, is 'getting more complex with time'. This book brings together a wide range of experts in science, philosophy and theology and unveils their joint effort in exploring this idea. They confront essential problems behind the theory of complexity and the role of life within it: what is complexity? When does it increase, and why? Is the universe evolving towards states of ever greater complexity and diversity? If so, what is the source of this universal enrichment? This book addresses those difficult questions, and offers a unique cross-disciplinary perspective on some of the most profound issues at the heart of science and philosophy. Readers will gain insights in complexity that reach deep into key areas of physics, biology, complexity science, philosophy and religion."</p><br/>Rick Quax's insight:<br/>Also contains many pointers to information-theoretical perspectives, especially in Davies' and Lineweaver's contributions.
Transfer Entropy and Transient Limits of Computation (Scientific Reports, Nature Publishing Group)
http://www.scoop.it/t/information-processing-in-complex-systems/p/4035293939/2015/01/15/transfer-entropy-and-transient-limits-of-computation-scientific-reports-nature-publishing-group
<img src='http://img.scoop.it/VXE_jzoY3OdveNzzF60GETl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><blockquote> Transfer entropy is a recently introduced information-theoretic measure quantifying directed statistical coherence between spatiotemporal processes, and is widely used in diverse fields ranging from finance to neuroscience. However, its relationships to fundamental limits of computation, such as Landauer's limit, remain unknown. Here we show that in order to increase transfer entropy (predictability) by one bit, heat flow must match or exceed Landauer's limit. Importantly, we generalise Landauer's limit to bi-directional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure). Furthermore, the results are related to negentropy, and to Bremermann's limit and the Bekenstein bound, producing, perhaps surprisingly, lower bounds on the computational deceleration and information loss incurred during an increase in predictability about the process. The identified relationships set new computational limits in terms of fundamental physical quantities, and establish transfer entropy as a central measure connecting information theory, thermodynamics and theory of computation.</blockquote>
Entropy | Special Issue : Information Processing in Complex Systems
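For scale, the Landauer limit invoked above is k_B T ln 2 of heat per bit. A back-of-the-envelope computation (the Boltzmann constant below is the exact 2019 SI value; 300 K stands in for room temperature):

```python
from math import log

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value since 2019)

def landauer_limit(temperature_kelvin):
    """Minimum heat dissipated per bit erased (or, per the paper's
    generalisation, per bit of transfer-entropy increase), in joules."""
    return K_B * temperature_kelvin * log(2)

e_room = landauer_limit(300.0)  # roughly 2.9e-21 J per bit at room temperature
```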
http://www.scoop.it/t/information-processing-in-complex-systems/p/4031255399/2014/11/07/entropy-special-issue-information-processing-in-complex-systems
<img src='http://img.scoop.it/ol0QZ4RUhCvslRFGgSb2Fzl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>All systems in nature have one thing in common: they process information. Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred and modified. Indeed, bits of information about the state of one element will travel—imperfectly—to the state of the other element, forming its new state. This storage, transfer, and modification of information, possibly between levels of a multi-level system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions. Mapping out exactly how these bits of information percolate through the system could reveal new fundamental insights into how the parts orchestrate to produce the properties of the system. A theory of information processing would be capable of defining a set of universal properties of dynamical multi-level complex systems, which describe and compare the dynamics of diverse complex systems ranging from social interaction to brain networks, from financial markets to biomedicine. Each possible combination of rules of dynamics and topology of interactions, with disparate semantics, would reduce to a single language of information processing.</p><br/>Rick Quax's insight:<br/>Submission deadline: February 28, 2015
Applying Information Theory to Neuronal Networks: From Theory to Experiments
http://www.scoop.it/t/information-processing-in-complex-systems/p/4031123567/2014/11/05/applying-information-theory-to-neuronal-networks-from-theory-to-experiments
<img src='http://img.scoop.it/lr1J-_7zYQxpnp0bxspMwTl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><blockquote> Information theory is increasingly being used to analyze complex, self-organizing processes on networks, predominantly in analytical and numerical studies. Perhaps one of the most paradigmatic complex systems is a network of neurons, in which cognition arises from the information storage, transfer, and processing among individual neurons. In this article we review experimental techniques suitable for validating information-theoretical predictions in simple neural networks, as well as generating new hypotheses. Specifically, we focus on techniques that may be used to measure both network (microcircuit) anatomy as well as neuronal activity simultaneously. This is needed to study the role of the network structure on the emergent collective dynamics, which is one of the reasons to study the characteristics of information processing. We discuss in detail two suitable techniques, namely calcium imaging and the application of multi-electrode arrays to simple neural networks in culture, and discuss their advantages and limitations in an accessible manner for non-experts. In particular, we show that each technique induces a qualitatively different type of error on the measured mutual information. The ultimate goal of this work is to bridge the gap between theorists and experimentalists in their shared goal of understanding the behavior of networks of neurons.</blockquote>
Information-based fitness and the emergence of criticality in living systems
http://www.scoop.it/t/information-processing-in-complex-systems/p/4031058497/2014/11/04/information-based-fitness-and-the-emergence-of-criticality-in-living-systems
<img src='http://img.scoop.it/6hAIyT1DVyNTg8xl8M72Pjl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><br/>Rick Quax's insight:<br/>Please correct me if I interpret it wrong. If agent A optimizes its description of another agent B then the Kullback-Leibler divergence (between the state of B and A's representation of B) decreases. This optimization is achieved by tuning the parameters which drive the state of A. Then the Fisher information of A's representation (parameterized by B's state) naturally increases, and an increasing Fisher information is assumed to imply criticality (along the lines of 'sensitivity to parameters').
JIDT: Java Information Dynamics Toolkit for studying information-theoretic measures of computation in complex systems - Google Project Hosting
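The 'sensitivity to parameters' reading of Fisher information in the insight above can be made concrete in the simplest possible setting, a Bernoulli variable: the Fisher information equals the variance of the score, and it grows without bound as the parameter approaches the values where the distribution changes most sharply. A numerical sketch (this is not the paper's model, just an illustration of the quantity; the Monte-Carlo estimator only cross-checks the analytic form):

```python
import random

def fisher_information_bernoulli(theta):
    """Analytic Fisher information of one Bernoulli(theta) sample: 1 / (theta (1 - theta))."""
    return 1.0 / (theta * (1.0 - theta))

def fisher_information_mc(theta, n=200_000, seed=0):
    """Monte-Carlo estimate via the variance of the score d/dtheta log p(x | theta)."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n):
        x = 1 if rng.random() < theta else 0
        scores.append(x / theta - (1 - x) / (1.0 - theta))  # the score function
    mean = sum(scores) / n
    return sum((s - mean) ** 2 for s in scores) / n

# Fisher information is minimal at theta = 0.5 (value 4) and diverges towards
# the boundaries, i.e. the distribution becomes ever more sensitive to its
# parameter -- the 'sensitivity to parameters' intuition in the insight above.
```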
http://www.scoop.it/t/information-processing-in-complex-systems/p/4026648086/2014/08/21/jidt-java-information-dynamics-toolkit-for-studying-information-theoretic-measures-of-computation-in-complex-systems-google-project-hosting
<img src='http://img.scoop.it/Sfn9UvylEbMm7OqULdNpjTl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>JIDT provides a stand-alone, open-source Java implementation (usable in Matlab, Octave, Python and R) of information-theoretic measures of distributed computation in complex systems: i.e. information storage, transfer and modification.</p><p>JIDT includes implementations:</p><ul><li>principally for the measures transfer entropy, mutual information, and their conditional variants, as well as active information storage, entropy, etc.;</li><li>for both discrete and continuous-valued data;</li><li>using various types of estimators (e.g. Kraskov-Stögbauer-Grassberger estimators, box-kernel estimation, linear-Gaussian),</li></ul><p>as described in full at ImplementedMeasures.</p><p>JIDT is distributed under the GNU GPL v3 license (or later).</p>
Electronically erased
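To make the measures concrete without the toolkit, here is a from-scratch plug-in estimator for the simplest of them, discrete transfer entropy with history length k. This is an independent illustration of the quantity, not JIDT code; JIDT's own API and estimator options are documented at ImplementedMeasures:

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(source, target, k=1):
    """Plug-in discrete transfer entropy TE(source -> target) in bits,
    with target history length k:
    TE = sum p(y_next, y_past, x) log2 [ p(y_next | y_past, x) / p(y_next | y_past) ].
    """
    triples = [(target[n + 1], tuple(target[n - k + 1:n + 1]), source[n])
               for n in range(k - 1, len(target) - 1)]
    n_obs = len(triples)
    c_xyz = Counter(triples)
    c_yz = Counter((y1, yk) for y1, yk, x in triples)
    c_zx = Counter((yk, x) for y1, yk, x in triples)
    c_z = Counter(yk for y1, yk, x in triples)
    te = 0.0
    for (y1, yk, x), c in c_xyz.items():
        # Counts cancel so that only one division by n_obs remains.
        te += (c / n_obs) * log2((c * c_z[yk]) / (c_yz[(y1, yk)] * c_zx[(yk, x)]))
    return te

# Toy coupled process: y simply copies x with a one-step lag, so information
# flows from x to y (about 1 bit per step) and not the other way round.
rng = random.Random(1)
x = [rng.randint(0, 1) for _ in range(10_000)]
y = [0] + x[:-1]
te_forward = transfer_entropy(x, y)  # close to 1 bit
te_reverse = transfer_entropy(y, x)  # close to 0 bits
```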
http://www.scoop.it/t/information-processing-in-complex-systems/p/4024837356/2014/07/18/electronically-erased
<p>The fate of Schrödinger's cat depends on the particular path of a single electron. If the electron hits the trigger, which opens a bottle of poisonous gas, then the cat dies. If the path misses the trigger, the cat lives. According to quantum mechanics, electrons do not normally follow a definite trajectory. Instead, an electron can trace both paths at the same time. As a result, Schrödinger's cat is both dead and alive. Quantum mechanics also teaches us that a “which-path” detector—that is, any measurement device that can show where the electron travels—forces the electron to choose just one path, thus sealing the cat's destiny. But what will happen after the information collected by the detector has been erased? The results of a semiconductor version of such an experiment, reported by Weisz et al. on page 1363 of this issue (1), suggest that the cat will come back to the middle ground between life and death.</p><br/>Rick Quax's insight:<br/>Erasing information restores a quantum state? Can information indeed be a fundamental concept in physics?
Information processing using a single dynamical node as complex system
http://www.scoop.it/t/information-processing-in-complex-systems/p/4024827612/2014/07/18/information-processing-using-a-single-dynamical-node-as-complex-system
<img src='http://img.scoop.it/Rj1dgRZbZVZ2qprJRvdY9Dl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing.</p>
ECCS'14: Satellite Meeting INFORMATION PROCESSING IN COMPLEX SYSTEMS (IPCS'14)
http://www.scoop.it/t/information-processing-in-complex-systems/p/4024208971/2014/07/07/eccs-14-satellite-meeting-information-processing-in-complex-systems-ipcs-14
<br/>Rick Quax's insight:<br/>This satellite meeting centers especially around the viewpoint of considering ‘information’ to be a quantity that is inherently stored, transferred, and modified (e.g., integrated or synergistic information) in (complex) dynamical systems, or, perhaps even more fundamentally, of starting to think of any dynamical process as a (Turing) computational process. This information-centered approach will hopefully bring better theory and understanding for the case of complex systems, for which traditional tools have had limited success. We are inviting like-minded researchers to meet at the satellite meeting “Information Processing in Complex Systems” (IPCS’14), now in its 3rd edition, during the upcoming ECCS’14 conference.<br/> <br/>We aim to grow this meeting into a central forum on the subject, within which researchers currently still form dispersed and fairly isolated groups with diverse ideas.<br/> <br/>Confirmed invited speakers Prof. dr. Kristian Lindgren, Dr. Paul Williams, and Hermann Haken will be present.<br/> <br/>The schedule is now online at <a href="http://computationalscience.nl/ipcs2014/" rel="nofollow">http://computationalscience.nl/ipcs2014/</a>
Entropy | Free Full-Text | Entropy in the Critical Zone: A Comprehensive Review
http://www.scoop.it/t/information-processing-in-complex-systems/p/4024209600/2014/07/07/entropy-free-full-text-entropy-in-the-critical-zone-a-comprehensive-review
<img src='http://img.scoop.it/lr1J-_7zYQxpnp0bxspMwTl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><blockquote> Thermodynamic entropy was initially proposed by Clausius in 1865. Since then it has been implemented in the analysis of different systems, and is seen as a promising concept to understand the evolution of open systems in non-equilibrium conditions. Information entropy was proposed by Shannon in 1948, and has become an important concept to measure information in different systems. Both thermodynamic entropy and information entropy have been extensively applied in different fields related to the Critical Zone, such as hydrology, ecology, pedology, and geomorphology. In this study, we review the most important applications of these concepts in those fields, including how they are calculated, and how they have been utilized to analyze different processes. We then synthesize the link between thermodynamic and information entropies in the light of energy dissipation and organizational patterns, and discuss how this link may be used to enhance the understanding of the Critical Zone.</blockquote>
Equitability, mutual information, and the maximal information coefficient
http://www.scoop.it/t/information-processing-in-complex-systems/p/4016929117/2014/03/03/equitability-mutual-information-and-the-maximal-information-coefficient
<img src='http://img.scoop.it/ayD_MaMJ4TqRJy1O6kjChTl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>How should one quantify the strength of association between two random variables without bias for relationships of a specific form? Despite its conceptual simplicity, this notion of statistical “equitability” has yet to receive a definitive mathematical formalization. Here we argue that equitability is properly formalized by a self-consistency condition closely related to the Data Processing Inequality. Mutual information, a fundamental quantity in information theory, is shown to satisfy this equitability criterion. These findings are at odds with the recent work of Reshef et al. [Reshef DN, et al. (2011) Science 334(6062):1518–1524], which proposed an alternative definition of equitability and introduced a new statistic, the “maximal information coefficient” (MIC), said to satisfy equitability in contradistinction to mutual information. These conclusions, however, were supported only with limited simulation evidence, not with mathematical arguments. Upon revisiting these claims, we prove that the mathematical definition of equitability proposed by Reshef et al. cannot be satisfied by any (nontrivial) dependence measure. We also identify artifacts in the reported simulation evidence. When these artifacts are removed, estimates of mutual information are found to be more equitable than estimates of MIC. Mutual information is also observed to have consistently higher statistical power than MIC. We conclude that estimating mutual information provides a natural (and often practical) way to equitably quantify statistical associations in large datasets.</p>
Frontiers | Local active information storage as a tool to understand distributed neural information processing | Frontiers in Neuroinformatics
http://www.scoop.it/t/information-processing-in-complex-systems/p/4016303750/2014/02/20/frontiers-local-active-information-storage-as-a-tool-to-understand-distributed-neural-information-processing-frontiers-in-neuroinformatics
<blockquote> Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today’s digital computers, the application of these concepts to neural information processing was hampered by the lack of proper mathematical definitions of these operations on information. Recently, such definitions were given and the specific concept of local active information storage was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure local active information storage in neural data has been made to date. Here we measure local active information storage on a local scale in time and space in voltage sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding.</blockquote>
Information Dynamics
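As a toy version of the quantity measured above, local active information storage can be estimated for a symbolic time series with plug-in probabilities: the local value at each time step is the log-ratio of the predicted to the unconditioned probability of the current sample. A minimal sketch (illustrative only; it ignores the embedding-length and stationarity issues a real analysis of imaging data must handle): for a perfectly periodic binary process, each sample is fully predicted by its predecessor, so storage averages to about one bit.

```python
from collections import Counter
from math import log2

def local_active_information_storage(series, k=1):
    """Local AIS a(n) = log2 [ p(x_n | x_{n-k}..x_{n-1}) / p(x_n) ] for each
    n >= k, using plug-in probabilities from the series itself."""
    pairs = [(tuple(series[n - k:n]), series[n]) for n in range(k, len(series))]
    n_obs = len(pairs)
    c_joint = Counter(pairs)
    c_past = Counter(p for p, x in pairs)
    c_next = Counter(x for p, x in pairs)
    return [log2((c_joint[(p, x)] / c_past[p]) / (c_next[x] / n_obs))
            for p, x in pairs]

# A perfectly periodic process 'stores' its phase: every sample is fully
# predictable from the previous one, so local AIS is about 1 bit everywhere.
series = [0, 1] * 50
local_ais = local_active_information_storage(series, k=1)
ais = sum(local_ais) / len(local_ais)  # average active information storage
```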
http://www.scoop.it/t/information-processing-in-complex-systems/p/4016299791/2014/02/20/information-dynamics
<img src='http://img.scoop.it/yBrGapOxrGQFITPO7-vJMTl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><br/>Rick Quax's insight:<br/>A philosophy-oriented article about the concept of 'information dynamics' and how information can be 'processed'.
Guided Self-Organization: Inception (Emergence, Complexity and Computation): Mikhail Prokopenko: 9783642537332: Amazon.com: Books
http://www.scoop.it/t/information-processing-in-complex-systems/p/4014119386/2014/01/13/guided-self-organization-inception-emergence-complexity-and-computation-mikhail-prokopenko-9783642537332-amazon-com-books
<img src='http://img.scoop.it/p7iJd37HleMMzqUmh3hPDjl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><blockquote> Guided Self-Organization: Inception (Emergence, Complexity and Computation), by Mikhail Prokopenko. Is it possible to guide the process of self-organisation towards specific patterns and outcomes?</blockquote>
A Trade-off between Local and Distributed Information Processing Associated with Remote Episodic versus Semantic Memory
http://www.scoop.it/t/information-processing-in-complex-systems/p/4013898877/2014/01/09/a-trade-off-between-local-and-distributed-information-processing-associated-with-remote-episodic-versus-semantic-memory
<p>Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provide a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.</p>
Phys. Rev. Lett. 111, 177203 (2013): Information Flow in a Kinetic Ising Model Peaks in the Disordered Phase
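The study above quantifies complexity via entropy of a discretized brain signal. The abstract does not specify its estimator, so the following is only a minimal illustrative sketch of the basic ingredient: a plug-in Shannon entropy over a binned real-valued signal. The binning scheme and toy signals are assumptions, not the authors' pipeline.

```python
from collections import Counter
import math

def shannon_entropy(samples):
    """Plug-in estimate of Shannon entropy (in bits) from discrete samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def discretize(signal, n_bins=4):
    """Map a real-valued signal onto equal-width bins."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / n_bins or 1.0  # avoid division by zero for flat signals
    return [min(int((x - lo) / width), n_bins - 1) for x in signal]

# A flat signal carries no information; a varied one carries more.
flat = discretize([1.0] * 100)
varied = discretize([math.sin(0.3 * t) for t in range(100)])
print(shannon_entropy(flat), shannon_entropy(varied))
```

With more bins and longer recordings this plug-in estimate becomes increasingly biased, which is why applied work typically uses bias-corrected estimators.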
http://www.scoop.it/t/information-processing-in-complex-systems/p/4012961156/2013/12/19/phys-rev-lett-111-177203-2013-information-flow-in-a-kinetic-ising-model-peaks-in-the-disordered-phase
<img src='http://img.scoop.it/Xpqse9oNLK1uw5Mtgk7-rDl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/>
Information Flow in a Kinetic Ising Model Peaks in the Disordered Phase
http://www.scoop.it/t/information-processing-in-complex-systems/p/4010078286/2013/10/29/information-flow-in-a-kinetic-ising-model-peaks-in-the-disordered-phase
<p>There is growing evidence that for a range of dynamical systems featuring complex interactions between large ensembles of interacting elements, mutual information peaks at order-disorder phase transitions. We conjecture that, by contrast, information flow in such systems will generally peak strictly on the disordered side of a phase transition. This conjecture is verified for a ferromagnetic 2D lattice Ising model with Glauber dynamics and a transfer entropy-based measure of systemwide information flow. Implications of the conjecture are considered, in particular, that for a complex dynamical system in the process of transitioning from disordered to ordered dynamics (a mechanism implicated, for example, in financial market crashes and the onset of some types of epileptic seizures), information dynamics may be able to predict an imminent transition.</p><p>Lionel Barnett, Joseph T. Lizier, Michael Harré, Anil K. Seth, and Terry Bossomaier</p><p>"Information Flow in a Kinetic Ising Model Peaks in the Disordered Phase"</p><p>Physical Review Letters 111, 177203 (2013)</p><p><a href="http://link.aps.org/doi/10.1103/PhysRevLett.111.177203" rel="nofollow">http://link.aps.org/doi/10.1103/PhysRevLett.111.177203</a></p>
Autonomy: An information theoretic perspective
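The PRL abstract above rests on a transfer-entropy measure of information flow. As a rough sketch of the quantity involved (not the authors' system-wide Ising implementation), here is a plug-in pairwise transfer entropy with history length 1, TE(Y→X) = Σ p(x⁺, x, y) · log₂[ p(x⁺ | x, y) / p(x⁺ | x) ], applied to an illustrative pair of binary series where the source perfectly drives the destination:

```python
from collections import Counter
import math
import random

def transfer_entropy(src, dst):
    """Plug-in transfer entropy (bits) from src to dst, history length 1."""
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))   # (x_{t+1}, x_t, y_t)
    n = len(triples)
    c_xxy = Counter(triples)
    c_xy = Counter((x, y) for _, x, y in triples)
    c_xx = Counter((xn, x) for xn, x, _ in triples)
    c_x = Counter(x for _, x, _ in triples)
    te = 0.0
    for (xn, x, y), c in c_xxy.items():
        p_joint = c / n                       # p(x_{t+1}, x_t, y_t)
        p_cond_xy = c / c_xy[(x, y)]          # p(x_{t+1} | x_t, y_t)
        p_cond_x = c_xx[(xn, x)] / c_x[x]     # p(x_{t+1} | x_t)
        te += p_joint * math.log2(p_cond_xy / p_cond_x)
    return te

# dst copies src with one step of delay, so src fully determines dst's next state:
# TE(src -> dst) is close to 1 bit, while TE(dst -> src) is close to 0.
random.seed(0)
src = [random.randint(0, 1) for _ in range(5000)]
dst = [0] + src[:-1]
print(transfer_entropy(src, dst), transfer_entropy(dst, src))
```

The paper's claim concerns the aggregate of such pairwise flows across a 2D lattice as temperature varies; reproducing that requires simulating Glauber dynamics, which is beyond this sketch.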
http://www.scoop.it/t/information-processing-in-complex-systems/p/4009606474/2013/10/21/autonomy-an-information-theoretic-perspective
<img src='http://img.scoop.it/3P_ekGtR-Izotr6TcXy9Mzl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>We present a tentative proposal for a quantitative measure of autonomy. This is something that, surprisingly, is rarely found in the literature, even though autonomy is considered to be a basic concept in many disciplines, including artificial life.</p><p>We work in an information theoretic setting for which the distinction between system and environment is the starting point. As a first measure for autonomy, we propose the conditional mutual information between consecutive states of the system conditioned on the history of the environment. This works well when the system cannot influence the environment at all and the environment does not interact synergetically with the system. When, in contrast, the system has full control over its environment, we should instead neglect the environment history and simply take the mutual information between consecutive system states as a measure of autonomy.</p><p>In the case of mutual interaction between system and...</p>
Information and closure in systems theory
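The autonomy abstract above proposes I(X_{t+1}; X_t | environment history) as a first measure. A minimal plug-in sketch of that quantity (with environment history truncated to a single step, which is my simplifying assumption, not the paper's general formulation) contrasts a system slaved to its environment with a self-driven one:

```python
from collections import Counter
import math
import random

def cond_mutual_info(a, b, c):
    """Plug-in conditional mutual information I(A;B|C) in bits."""
    n = len(a)
    abc = Counter(zip(a, b, c))
    ac = Counter(zip(a, c))
    bc = Counter(zip(b, c))
    cc = Counter(c)
    total = 0.0
    for (ai, bi, ci), k in abc.items():
        # p(a,b,c) * log2[ p(a,b,c) p(c) / (p(a,c) p(b,c)) ]
        total += (k / n) * math.log2(k * cc[ci] / (ac[(ai, ci)] * bc[(bi, ci)]))
    return total

random.seed(1)
env = [random.randint(0, 1) for _ in range(6000)]

# System slaved to its environment: next state just copies the environment.
slaved = [0] + env[:-1]

# Self-driven system: keeps its own state with probability 0.9, ignores env.
auto = [0]
for _ in range(5999):
    auto.append(auto[-1] if random.random() < 0.9 else 1 - auto[-1])

# Autonomy as I(X_{t+1}; X_t | E_t): ~0 for the slaved system, high for the other.
print(cond_mutual_info(slaved[1:], slaved[:-1], env[:-1]))
print(cond_mutual_info(auto[1:], auto[:-1], env[:-1]))
```

As the abstract notes, this conditioning-based measure is only appropriate when the system does not control its environment; the fully controlling case calls for the unconditioned mutual information instead.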
http://www.scoop.it/t/information-processing-in-complex-systems/p/4009607032/2013/10/21/information-and-closure-in-systems-theory
<p>The notion of closure plays a prominent role in systems theory where it is used to identify or define the system in distinction from its environment and to explain the autonomy of the system. Here, we present a quantitative measure, as opposed to the already existing qualitative notions, of closure. We shall elaborate upon the observation that cognitive systems can achieve informational closure by modeling their environment. Formally, then, a system is informationally closed if (almost) no information flows into it from the environment. A system that is independent from its environment trivially achieves informational closure. Simulations of coupled hidden Markov models demonstrate that informational closure can also be realized non-trivially by modeling or controlling the environment. Our analysis of systems that actively influence their environment to achieve closure then reveals interesting connections to the related notion of autonomy. This discussion will then call into question the system-environment distinction that seems so innocent to begin with. It turns out that the notion of autonomy depends crucially on whether, not just the state observables, but also the dynamical processes are attributed to either the system or the environment. In that manner, our conceptualization of informational closure also sheds light on other, more ambitious notions of closure, e.g. organizational closure, semantic closure, closure to efficient cause or operational closure, intended as a fundamental (defining) concept of life itself.</p>
Shared Information -- New Insights and Problems in Decomposing Information in Complex Systems
http://www.scoop.it/t/information-processing-in-complex-systems/p/4009605134/2013/10/21/shared-information-new-insights-and-problems-in-decomposing-information-in-complex-systems
<img src='http://img.scoop.it/0QKteeFhQ8jTKBqrQf11jjl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>How can the information that a set of random variables contains about another random variable be decomposed? To what extent do different subgroups provide the same, i.e. shared or redundant, information, carry unique information or interact for the emergence of synergistic information? Recently Williams and Beer proposed such a decomposition based on natural properties for shared information. While these properties fix the structure of the decomposition, they do not uniquely specify the values of the different terms. Therefore, we investigate additional properties such as strong symmetry and left monotonicity. We find that strong symmetry is incompatible with the properties proposed by Williams and Beer. Although left monotonicity is a very natural property for an information measure, it is not fulfilled by any of the proposed measures. We also study a geometric framework for information decompositions and ask whether it is possible to represent shared information by a family of posterior distributions. Finally, we draw connections to the notions of shared knowledge and common knowledge in game theory. While many people believe that independent variables cannot share information, we show that in game theory independent agents can have shared knowledge, but not common knowledge. We conclude that intuition and heuristic arguments do not suffice when arguing about information.</p>
A framework for the local information dynamics of distributed computation in complex systems
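The Williams–Beer decomposition critiqued in the abstract above defines redundancy via their I_min measure: for each target outcome s, take the minimum over sources of the specific information I(S=s; A_i), then average over s. A small self-contained sketch (the distributions are illustrative assumptions) shows the two textbook extremes, XOR (zero redundancy, pure synergy) and duplicated inputs (full redundancy):

```python
from collections import defaultdict
import math

def specific_info(joint, s, source_idx):
    """Williams-Beer specific information I(S=s; A_i) in bits.
    joint: dict mapping (a1, a2, s) -> probability."""
    p_s = sum(p for k, p in joint.items() if k[2] == s)
    p_a = defaultdict(float)    # marginal p(a) for the chosen source
    p_as = defaultdict(float)   # joint p(a, s)
    for (a1, a2, sv), p in joint.items():
        a = (a1, a2)[source_idx]
        p_a[a] += p
        if sv == s:
            p_as[a] += p
    total = 0.0
    for a, pas in p_as.items():
        if pas > 0:
            total += (pas / p_s) * math.log2((pas / p_a[a]) / p_s)
    return total

def i_min(joint):
    """Redundant information about S shared by the two single sources."""
    states = {k[2] for k in joint}
    red = 0.0
    for s in states:
        p_s = sum(p for k, p in joint.items() if k[2] == s)
        red += p_s * min(specific_info(joint, s, 0), specific_info(joint, s, 1))
    return red

# XOR: each input alone tells nothing about the output -> zero redundancy.
xor = {(a1, a2, a1 ^ a2): 0.25 for a1 in (0, 1) for a2 in (0, 1)}
# Duplicated input: both sources are exact copies of S -> 1 bit of redundancy.
dup = {(s, s, s): 0.5 for s in (0, 1)}
print(i_min(xor), i_min(dup))
```

The paper's point is precisely that I_min is not the only measure compatible with Williams and Beer's axioms, and that further desiderata such as left monotonicity rule out all candidates proposed so far.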
http://www.scoop.it/t/information-processing-in-complex-systems/p/4009606179/2013/10/21/a-framework-for-the-local-information-dynamics-of-distributed-computation-in-complex-systems
<p>The nature of distributed computation has often been described in terms of the component operations of universal computation: information storage, transfer and modification. We review the first complete framework that quantifies each of these individual information dynamics on a local scale within a system, and describes the manner in which they interact to create non-trivial computation where "the whole is greater than the sum of the parts". We describe the application of the framework to cellular automata, a simple yet powerful model of distributed computation. This is an important application, because the framework is the first to provide quantitative evidence for several important conjectures about distributed computation in cellular automata: that blinkers embody information storage, particles are information transfer agents, and particle collisions are information modification events. The framework is also shown to contrast the computations conducted by several well-known cellular automata, highlighting the importance of information coherence in complex computation. The results reviewed here provide important quantitative insights into the fundamental nature of distributed computation and the dynamics of complex systems, as well as impetus for the framework to be applied to the analysis and design of other systems.</p>
Inferring effective computational connectivity using incrementally conditioned multivariate transfer entropy
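The local-dynamics framework reviewed above assigns an information value to each point in space-time; for storage, the local active information is a(t) = log₂[ p(xₜ | past_k) / p(xₜ) ]. A minimal plug-in sketch (history length and the toy "blinker" series are illustrative choices, not the framework's full cellular-automaton machinery) shows why a period-2 blinker counts as storage, contributing about 1 bit at every step:

```python
from collections import Counter
import math

def local_active_info(series, k=1):
    """Local active information storage (bits) with history length k:
    a(t) = log2[ p(x_t | past_k) / p(x_t) ], one value per step t >= k."""
    pairs = [(tuple(series[t - k:t]), series[t]) for t in range(k, len(series))]
    n = len(pairs)
    c_pair = Counter(pairs)
    c_past = Counter(past for past, _ in pairs)
    c_next = Counter(nxt for _, nxt in pairs)
    out = []
    for past, nxt in pairs:
        p_cond = c_pair[(past, nxt)] / c_past[past]   # p(x_t | past)
        p_next = c_next[nxt] / n                      # p(x_t)
        out.append(math.log2(p_cond / p_next))
    return out

# A period-2 'blinker' stores its full 1 bit of state at every time step.
blinker = [0, 1] * 50
locals_ = local_active_info(blinker)
print(sum(locals_) / len(locals_))   # average = active information storage
```

Averaging the local values recovers the usual (global) active information storage; the framework's insight is that keeping them local reveals where and when storage, transfer, and modification happen.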
http://www.scoop.it/t/information-processing-in-complex-systems/p/4009604417/2013/10/21/inferring-effective-computational-connectivity-using-incrementally-conditioned-multivariate-transfer-entropy
<p>Poster presentation</p>
Towards a Synergy-based Approach to Measuring Information Modification
http://www.scoop.it/t/information-processing-in-complex-systems/p/4009225089/2013/10/14/towards-a-synergy-based-approach-to-measuring-information-modification
<p>Distributed computation in artificial life and complex systems is often described in terms of component operations on information: information storage, transfer and modification. Information modification remains poorly described, however, with the popularly-understood examples of glider and particle collisions in cellular automata being only quantitatively identified to date using a heuristic (separable information) rather than a proper information-theoretic measure. We outline how a recently-introduced axiomatic framework for measuring information redundancy and synergy, called partial information decomposition, can be applied to a perspective of distributed computation in order to quantify component operations on information. Using this framework, we propose a new measure of information modification that captures the intuitive understanding of information modification events as those involving interactions between two or more information sources. We also consider how the local dynamics of information modification in space and time could be measured, and suggest a new axiom that redundancy measures would need to meet in order to make such local measurements. Finally, we evaluate the potential for existing redundancy measures to meet this localizability axiom.</p>
Information Causality as a Physical Principle
http://www.scoop.it/t/information-processing-in-complex-systems/p/4009095352/2013/10/11/information-causality-as-a-physical-principle
<img src='http://img.scoop.it/Nd2qFnacx-66GHv4DY3NXzl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>Nature</p><br/>Rick Quax's insight:<br/>Could also be relevant without the quantum aspect?
Information storage, loop motifs, and clustered structure in complex networks
http://www.scoop.it/t/information-processing-in-complex-systems/p/4008218070/2013/09/25/information-storage-loop-motifs-and-clustered-structure-in-complex-networks
<p>We use a standard discrete-time linear Gaussian model to analyze the information storage capability of individual nodes in complex networks, given the network structure and link weights. In particular, we investigate the role of two- and three-node motifs in contributing to local information storage. We show analytically that directed feedback and feedforward loop motifs are the dominant contributors to information storage capability, with their weighted motif counts locally positively correlated to storage capability. We also reveal the direct local relationship between clustering coefficient(s) and information storage. These results explain the dynamical importance of clustered structure and offer an explanation for the prevalence of these motifs in biological and artificial networks.</p><p>Information storage, loop motifs, and clustered structure in complex networks</p><p>Joseph T. Lizier, Fatihcan M. Atay and Jürgen Jost</p><p>Phys. Rev. E 86, 026110 (2012)</p><p><a href="http://dx.doi.org/10.1103/PhysRevE.86.026110" rel="nofollow">http://dx.doi.org/10.1103/PhysRevE.86.026110</a></p>
The Local Information Dynamics of Distributed Computation in Complex Systems
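For linear Gaussian dynamics like those in the PRE paper above, information storage has a closed form: with a single-step history, a node's active information storage reduces to I(X_{t+1}; X_t) = −0.5 · log₂(1 − r²), where r is the lag-1 autocorrelation. The following univariate AR(1) sketch is my own simplification for illustration — the paper works with multi-node networks and motif structure, which this does not capture:

```python
import math
import random

def gaussian_ais(series):
    """Active information storage (bits) of a univariate linear-Gaussian
    process with history length 1: AIS = -0.5 * log2(1 - r^2),
    where r is the lag-1 autocorrelation."""
    n = len(series) - 1
    x, y = series[:-1], series[1:]
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    r = cov / math.sqrt(vx * vy)
    return -0.5 * math.log2(1 - r * r)

# AR(1) with self-coupling 0.8: theoretical AIS = -0.5*log2(1 - 0.64) ~ 0.74 bits.
random.seed(2)
x = [0.0]
for _ in range(20000):
    x.append(0.8 * x[-1] + random.gauss(0, 1))
print(gaussian_ais(x))
```

The paper's analytical results essentially extend this kind of expression to whole networks, showing which motifs (feedback and feedforward loops) raise a node's effective self-predictability.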
http://www.scoop.it/t/information-processing-in-complex-systems/p/4008217174/2013/09/25/the-local-information-dynamics-of-distributed-computation-in-complex-systems
<img src='http://img.scoop.it/QXpuFugp-m3el9xLU-90Rzl72eJkfbmt4t8yenImKBV9ip2J1EIeUzA9paTSgKmv' /><br/><p>The nature of distributed computation in complex systems has often been described in terms of memory, communication and processing. This thesis presents a complete information-theoretic framework to quantify these operations on information (i.e. information storage, transfer and modification), and in particular their dynamics in space and time. The framework is applied to cellular automata, and delivers important insights into the fundamental nature of distributed computation and the dynamics of complex systems (e.g. that gliders are dominant information transfer agents). Applications to several important network models, including random Boolean networks, suggest that the capability for information storage and coherent transfer are maximised near the critical regime in certain order-chaos phase transitions. Further applications to study and design information structure in the contexts of computational neuroscience and guided self-organisation underline the practical utility of the techniques presented here.</p><p>"The Local Information Dynamics of Distributed Computation in Complex Systems"</p><p>Joseph T. Lizier</p><p>(With foreword by Dr. Mikhail Prokopenko)</p><p>Springer Theses, Springer: Berlin/Heidelberg, 2013.</p><p><a href="http://dx.doi.org/10.1007/978-3-642-32952-4" rel="nofollow">http://dx.doi.org/10.1007/978-3-642-32952-4</a></p>