Dynamical Structures in Complex Systems
Scooped by Alessandro Filisetti

An Information-Theoretic Formalism for Multiscale Structure in Complex Systems | NECSI

Scooped by Alessandro Filisetti

[1405.0126] Is Consciousness Computable? Quantifying Integrated Information Using Algorithmic Information Theory

In this article we review Tononi's (2008) theory of consciousness as integrated information. We argue that previous formalizations of integrated information (e.g. Griffith, 2014) depend on information loss. Since lossy integration would necessitate continuous damage to existing memories, we propose it is more natural to frame consciousness as a lossless integrative process and provide a formalization of this idea using algorithmic information theory. We prove that complete lossless integration requires noncomputable functions. This result implies that if unitary consciousness exists, it cannot be modelled computationally.
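Kolmogorov complexity itself is not computable, so any hands-on illustration has to fall back on a proxy. The sketch below is not the paper's formalization: it uses zlib-compressed length as a crude stand-in for algorithmic complexity and reads "integration" as the gap between describing the parts separately and describing the whole jointly. The function names and test strings are illustrative assumptions.

```python
import zlib
import random

def K(data: bytes) -> int:
    # Crude proxy for Kolmogorov complexity: length of the zlib-compressed string.
    # (True K is uncomputable; a real compressor only ever gives an upper bound.)
    return len(zlib.compress(data, 9))

def integration_proxy(parts: list[bytes]) -> int:
    # Sum of the parts' compressed lengths minus the compressed length of the whole.
    # A large positive value means the joint description is much shorter than the
    # separate ones, i.e. the parts share (integrate) a lot of information.
    whole = b"".join(parts)
    return sum(K(p) for p in parts) - K(whole)

if __name__ == "__main__":
    a = b"0110" * 500                                                # two strongly related "subsystems"
    b = b"0110" * 500
    c = bytes(random.Random(0).randrange(256) for _ in range(2000))  # unrelated noise
    print("related parts:  ", integration_proxy([a, b]))   # clearly positive
    print("unrelated parts:", integration_proxy([a, c]))   # much smaller
```

With a real compressor the numbers are only qualitative (stream headers and block overheads leak in), but the direction of the comparison is the point: related parts yield a larger gap than unrelated ones.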

Scooped by Alessandro Filisetti

The detection of intermediate-level emergent structures and patterns


Artificial life is largely concerned with systems that exhibit various emergent phenomena, yet identifying emergent structures is frequently a difficult challenge. In this paper we introduce a method to identify candidate emergent mesolevel dynamical structures in dynamical networks. The method is based on an extension of a measure introduced for detecting clusters in biological neural networks; its main novelty compared with previous applications of similar measures is that we use it on truly dynamical networks, not only on fluctuations around stable asymptotic states. The identified structures are clusters of elements that behave in a coherent and coordinated way while interacting only loosely with the remainder of the system. We have evidence that the approach can identify these "emerging things" in artificial network models and in more complex data coming from catalytic reaction networks and a biological gene regulatory system (A. thaliana). We believe this method could suggest interesting new ways of dealing with artificial and biological systems.
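The measure alluded to above belongs to the family of Tononi-style cluster indices: integration inside a candidate group of nodes divided by its mutual information with the rest of the system. A minimal sketch under that classic definition, for binary time-series data, is given below; it is not the authors' code, the toy network and function names are invented for illustration, and it omits the normalization and significance testing a full method would apply.

```python
import numpy as np

def entropy(states: np.ndarray) -> float:
    # Shannon entropy (bits) of the joint state; `states` has shape (timesteps, variables).
    _, counts = np.unique(states, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def integration(states: np.ndarray) -> float:
    # I(S) = sum_i H(X_i) - H(S): deviation of the group from statistical independence.
    return sum(entropy(states[:, [i]]) for i in range(states.shape[1])) - entropy(states)

def mutual_information(a: np.ndarray, b: np.ndarray) -> float:
    # M(A;B) = H(A) + H(B) - H(A,B).
    return entropy(a) + entropy(b) - entropy(np.hstack([a, b]))

def cluster_index(data: np.ndarray, cluster: list) -> float:
    # Tononi-style cluster index: internal integration over coupling with the rest.
    rest = [i for i in range(data.shape[1]) if i not in cluster]
    m = mutual_information(data[:, cluster], data[:, rest])
    return float("inf") if m == 0 else integration(data[:, cluster]) / m

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = 2000
    hidden = rng.integers(0, 2, size=t)                   # shared signal
    noise = (rng.random((t, 3)) < 0.05).astype(int)
    coordinated = (hidden[:, None] + noise) % 2           # nodes 0-2 track the signal
    independent = rng.integers(0, 2, size=(t, 2))         # nodes 3-4 flip coins
    data = np.hstack([coordinated, independent])
    print("cluster {0,1,2}:", cluster_index(data, [0, 1, 2]))  # high: coherent group
    print("cluster {3,4}:  ", cluster_index(data, [3, 4]))     # low: no real structure
```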

Scooped by Alessandro Filisetti

Integrated information theory - Wikipedia, the free encyclopedia


Integrated information theory is a theoretical framework for understanding and explaining the nature of consciousness. It was developed by psychiatrist and neuroscientist Giulio Tononi of the University of Wisconsin-Madison. The theory rests on two key propositions: first, that every observable conscious state contains a large amount of information; second, that this information is integrated, experienced as a unified whole.

Alessandro Filisetti's insight:

Here is the Wikipedia page introducing integrated information theory.

Rescooped by Alessandro Filisetti from Papers

Transfer Entropy and Transient Limits of Computation

Transfer entropy is a recently introduced information-theoretic measure quantifying directed statistical coherence between spatiotemporal processes, and is widely used in diverse fields ranging from finance to neuroscience. However, its relationships to fundamental limits of computation, such as Landauer's limit, remain unknown. Here we show that in order to increase transfer entropy (predictability) by one bit, heat flow must match or exceed Landauer's limit. Importantly, we generalise Landauer's limit to bi-directional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure). Furthermore, the results are related to negentropy, and to Bremermann's limit and the Bekenstein bound, producing, perhaps surprisingly, lower bounds on the computational deceleration and information loss incurred during an increase in predictability about the process. The identified relationships set new computational limits in terms of fundamental physical quantities, and establish transfer entropy as a central measure connecting information theory, thermodynamics and theory of computation.

 

Transfer Entropy and Transient Limits of Computation
Mikhail Prokopenko and Joseph T. Lizier
Scientific Reports 4, 5394, doi:10.1038/srep05394
http://www.nature.com/srep/2014/140623/srep05394/full/srep05394.html


Via Complexity Digest
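
To make the quantity being bounded concrete, here is a minimal plug-in estimator of transfer entropy for binary time series (history length 1), together with the Landauer cost of a bit, k_B T ln 2. This is a generic textbook-style sketch, not the authors' code; the toy example, in which the target copies the source with a one-step delay, is an assumption made up for illustration.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source: np.ndarray, target: np.ndarray) -> float:
    # Plug-in estimate of TE_{source -> target} in bits, with history length k = l = 1:
    # TE = sum p(x_{t+1}, x_t, y_t) * log2[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]
    x_next, x_prev, y_prev = target[1:], target[:-1], source[:-1]
    n = len(x_next)
    joint = Counter(zip(x_next, x_prev, y_prev))
    pair_xy = Counter(zip(x_prev, y_prev))
    pair_xx = Counter(zip(x_next, x_prev))
    single_x = Counter(x_prev)
    te = 0.0
    for (xn, xp, yp), c in joint.items():
        p_joint = c / n
        p_cond_xy = c / pair_xy[(xp, yp)]                 # p(x_{t+1} | x_t, y_t)
        p_cond_x = pair_xx[(xn, xp)] / single_x[xp]       # p(x_{t+1} | x_t)
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te

def landauer_cost_joules(bits: float, temperature_kelvin: float = 300.0) -> float:
    # Minimum heat (J) dissipated to erase `bits` of information: k_B * T * ln(2) per bit.
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return bits * k_B * temperature_kelvin * np.log(2)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = rng.integers(0, 2, size=10000)
    x = np.empty_like(y)
    x[0] = 0
    x[1:] = y[:-1]          # target copies the source with a one-step delay
    print("TE y->x (bits):", transfer_entropy(y, x))   # close to 1 bit per step
    print("TE x->y (bits):", transfer_entropy(x, y))   # close to 0
    print("Landauer cost of 1 bit at 300 K (J):", landauer_cost_joules(1.0))
```

In the toy example, transfer entropy from source to target comes out near one bit per step and the reverse direction near zero, which is exactly the directional asymmetry the measure is designed to capture.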
Colbert Sesanker's curator insight, August 30, 2014 10:40 PM

Combine with integrated information.

Scooped by Alessandro Filisetti

Dynamical Structures in Complex Systems: an information theoretic perspective (DySCS)

Parallel event at the European Conference on Complex Systems 2014. Date: 25 September. Venue: IMT Institute for Advanced Studies Lucca campus. Information theory can provide powerful tools for studying...
Scooped by Alessandro Filisetti

From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0

PLOS Computational Biology is an open-access journal. ("The evolved version of IIT is finally out. From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0 http://t.co/zFN9VUSDJF")