Complex systems present problems in both mathematical modelling and philosophical foundations. The study of complex systems represents a new approach to science that investigates how relationships between parts give rise to the collective behaviors of a system and how the system interacts and forms relationships with its environment. The equations from which models of complex systems are developed generally derive from statistical physics, information theory and non-linear dynamics, and represent organized but unpredictable behaviors of natural systems that are considered fundamentally complex.
Long-term behavior of nonlinear deterministic continuous-time signals can be studied in terms of their reconstructed attractors. Reconstructed attractors of a continuous signal are meant to be topologically equivalent representations of the dynamics of the unknown dynamical system which generates the signal. Sometimes the geometry of the attractor, or its complexity, may give important information on the system of interest. However, if the trajectories of the attractor behave as if they do not come from a continuous system, or there exist many spike-like structures along the system trajectories, then there is no way to characterize the shape of the attractor. In this article, the traditional attractor reconstruction method is first applied to two types of ECG signals: normal healthy persons (NHP) and congestive heart failure patients (CHFP). As is common in such a framework, the reconstructed attractors are not at all well formed, and hence it is not possible to adequately characterize their geometrical features. Thus, we incorporate frequency-domain information into the given time signals. This is done by transforming the signals to a time-frequency domain by means of suitable wavelet transforms (WT). The transformed signal involves two non-homogeneous variables and is still quite difficult to use for reconstructing dynamics. By applying a suitable mapping, this signal is further converted into the integer domain, and a new type of 3D plot, called the integer lag plot, which characterizes and distinguishes the ECG signals of NHP and CHFP, is finally obtained.
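The attractor reconstruction step mentioned above is usually done by time-delay (Takens) embedding. A minimal sketch, with illustrative dimension and lag values (in practice these are chosen from the data, e.g. via mutual information and false nearest neighbours):

```python
import numpy as np

def delay_embed(x, dim=3, lag=5):
    """Reconstruct a dim-dimensional attractor from a scalar series x
    using time-delay (Takens) embedding with the given lag."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

# Example: a noiseless sine wave embeds as a closed loop.
t = np.linspace(0, 20 * np.pi, 2000)
points = delay_embed(np.sin(t), dim=3, lag=25)
print(points.shape)  # (1950, 3)
```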
Extreme events, a type of collective behavior in complex networked dynamical systems, can often have catastrophic consequences. Developing effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network “mobile” can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our findings for areas such as cybersecurity are discussed.
Controlling extreme events on complex networks • Yu-Zhong Chen, Zi-Gang Huang & Ying-Cheng Lai
Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake in the use of (Shannon) information-theoretic measures to analyse the dynamics of complex systems in these fields. We introduce the Java Information Dynamics Toolkit (JIDT): a Google code project which provides a standalone, (GNU GPL v3 licensed) open-source code implementation for empirical estimation of information-theoretic measures from time-series data. While the toolkit provides classic information-theoretic measures (e.g. entropy, mutual information, conditional mutual information), it ultimately focusses on implementing higher-level measures for information dynamics. That is, JIDT focusses on quantifying information storage, transfer and modification, and the dynamics of these operations in space and time. For this purpose, it includes implementations of the transfer entropy and active information storage, their multivariate extensions and local or pointwise variants. JIDT provides implementations for both discrete and continuous-valued data for each measure, including various types of estimator for continuous data (e.g. Gaussian, box-kernel and Kraskov-Stoegbauer-Grassberger) which can be swapped at run-time due to Java's object-oriented polymorphism. Furthermore, while written in Java, the toolkit can be used directly in MATLAB, GNU Octave and Python. We present the principles behind the code design, and provide several examples to guide users.
"JIDT: An information-theoretic toolkit for studying the dynamics of complex systems" Joseph T. Lizier, arXiv:1408.3270, 2014 http://arxiv.org/abs/1408.3270
The idea is advanced that self-organization in complex systems can be treated as decision making (as it is performed by humans) and, vice versa, decision making is nothing but a kind of self-organization in the decision maker's nervous system. A mathematical formulation is suggested based on the definition of probabilities of system states, whose particular cases characterize the probabilities of structures, patterns, scenarios, or prospects. In this general framework, it is shown that the mathematical structures of self-organization and of decision making are identical. This makes it clear how self-organization can be seen as an endogenous decision making process and, reciprocally, decision making occurs via an endogenous self-organization. The approach is illustrated by phase transitions in large statistical systems, crossovers in small statistical systems, evolutions and revolutions in social and biological systems, structural self-organization in dynamical systems, and by the probabilistic formulation of classical and behavioral decision theories. In all these cases, self-organization is described as the process of evaluating the probabilities of macroscopic states or prospects in the search for a state with the largest probability. The general way of deriving the probability measure for classical systems is the principle of minimal information, that is, the conditional entropy maximization under given constraints. Behavioral biases of decision makers can be characterized in the same way, as analogous to quantum fluctuations in natural systems.
We show that dynamical systems with spatial degrees of freedom naturally evolve into a self-organized critical point. Flicker noise, or 1/f noise, can be identified with the dynamics of the critical state. This picture also yields insight into the origin of fractal objects.
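A standard toy model exhibiting this self-organized criticality is the sandpile. A minimal sketch with toppling threshold 4 on a 2D lattice (lattice size, grain count and seed are arbitrary choices here):

```python
import random

def sandpile_avalanches(size=20, grains=5000, threshold=4, seed=0):
    """Drop grains one at a time on a size x size lattice. A site with
    >= threshold grains topples, sending one grain to each of its four
    neighbours (grains leaving the edge are lost). Returns the number
    of topplings (avalanche size) caused by each dropped grain."""
    rng = random.Random(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(grains):
        i, j = rng.randrange(size), rng.randrange(size)
        grid[i][j] += 1
        topples = 0
        stack = [(i, j)]
        while stack:
            x, y = stack.pop()
            if grid[x][y] < threshold:
                continue
            grid[x][y] -= threshold
            topples += 1
            if grid[x][y] >= threshold:     # may still be unstable
                stack.append((x, y))
            for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                if 0 <= nx < size and 0 <= ny < size:
                    grid[nx][ny] += 1
                    if grid[nx][ny] >= threshold:
                        stack.append((nx, ny))
        sizes.append(topples)
    return sizes

sizes = sandpile_avalanches()
# After a transient, avalanche sizes become broadly distributed:
# many tiny avalanches and occasional very large ones.
print(max(sizes))
```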
The character of the time-asymptotic evolution of physical systems can have complex, singular behavior with variation of a system parameter, particularly when chaos is involved. A perturbation of the parameter by a small amount ϵ can convert an attractor from chaotic to non-chaotic or vice versa. We call a parameter value where this can happen ϵ-uncertain. The probability that a random choice of the parameter is ϵ-uncertain commonly scales like a power law in ϵ. Surprisingly, two seemingly similar ways of defining this scaling, both of physical interest, yield different numerical values for the scaling exponent. We show why this happens and present a quantitative analysis of this phenomenon.
We study the two-particle annihilation reaction A+B→∅ on interconnected scale-free networks. We show that the mixing of particles and the evolution of the process are influenced by the number of interconnecting links and by their functional properties, while, surprisingly, when the interconnecting links have the same function as the links within the networks, the process is not affected by the interconnectivity strategies in use. Due to the better mixing, which suppresses the segregation effect, we show that the reaction rates are faster than what was observed in other topologies, in line with previous studies performed on single scale-free networks.
To further advance our understanding of the brain, new concepts and theories are needed. In particular, the ability of the brain to create information flows must be reconciled with its propensity for synchronization and mass action. The theoretical and empirical framework of Coordination Dynamics, a key aspect of which is metastability, is presented as a starting point to study the interplay of integrative and segregative tendencies that are expressed in space and time during the normal course of brain and behavioral function. Some recent shifts in perspective are emphasized that may ultimately lead to a better understanding of brain complexity.
Signal analysis is one of the finest scientific techniques in communication theory. The quantitative and qualitative measures that describe the pattern of a music signal vary from one signal to another. The same musical recital, when played by different instrumentalists, generates different music patterns. The reason behind these varying patterns is that the psychoacoustic measures (Dynamics, Timbre, Tonality and Rhythm) vary in each case. However, a psychoacoustic study of the music signals does not reveal anything about the similarity between the signals. For such cases, the study of the synchronization of long-term nonlinear dynamics may provide effective results. In this context, phase synchronization (PS) is one measure of synchronization between two non-identical signals. In fact, it is very difficult to investigate any other kind of synchronization under experimental conditions, because the signals are completely non-identical. Also, there exists an equivalence between the phases and the distances of the diagonal lines in the recurrence plot (RP) of the signals, which is quantifiable by the recurrence quantification measure tau-recurrence rate. This paper considers two nonlinear music signals, based on the same raga played by two eminent sitar instrumentalists, as two non-identical sources. The psychoacoustic study shows how the Dynamics, Timbre, Tonality and Rhythm vary between the two music signals. Then, a long-term analysis in the form of phase space reconstruction is performed, which reveals chaotic phase spaces for both signals. From the RPs of both phase spaces, the tau-recurrence rate is calculated. Finally, a correlation between the normalized tau-recurrence rates of their 3D phase spaces and the PS of the two music signals is established. The numerical results well support the analysis.
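The recurrence plot and tau-recurrence rate used above can be written down compactly. A minimal sketch (the threshold eps and the periodic test signal are illustrative choices, not the paper's music data):

```python
import numpy as np

def recurrence_matrix(points, eps):
    """Binary recurrence plot: R[i, j] = 1 iff trajectory points i and j
    are closer than eps in phase space."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    return (d < eps).astype(int)

def tau_recurrence_rate(R, tau):
    """Fraction of recurrent points along the diagonal at offset tau
    from the main diagonal of the recurrence plot."""
    return np.diagonal(R, offset=tau).mean()

t = np.linspace(0, 8 * np.pi, 401)               # period = 100 samples
traj = np.column_stack([np.sin(t), np.cos(t)])   # a perfectly periodic orbit
R = recurrence_matrix(traj, eps=0.1)
# Perfect recurrence one full period apart, none half a period apart.
print(tau_recurrence_rate(R, 100), tau_recurrence_rate(R, 50))  # 1.0 0.0
```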
Measures of nonlinearity and complexity, and in particular the study of Lyapunov exponents, have been increasingly used to characterize dynamical properties of a wide range of biological nonlinear systems, including cardiovascular control. In this work, we present a novel methodology able to effectively estimate the Lyapunov spectrum of a series of stochastic events in an instantaneous fashion. The paradigm relies on a novel point-process high-order nonlinear model of the event series dynamics. The long-term information is taken into account by expanding the linear, quadratic, and cubic Wiener-Volterra kernels with the orthonormal Laguerre basis functions. Applications to synthetic data such as the Hénon map and Rössler attractor, as well as two experimental heartbeat interval datasets (i.e., healthy subjects undergoing postural changes and patients with severe cardiac heart failure), focus on estimation and tracking of the Instantaneous Dominant Lyapunov Exponent (IDLE). The novel cardiovascular assessment demonstrates that our method is able to effectively and instantaneously track the nonlinear autonomic control dynamics, allowing for complexity variability estimations.
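The paper's estimator is an instantaneous point-process method; as a much simpler baseline on one of its synthetic benchmarks, the sketch below estimates the largest Lyapunov exponent of the Hénon map by the classic tangent-vector method (a = 1.4, b = 0.3 are the standard map parameters, not values from the paper):

```python
from math import log, hypot

def henon_largest_lyapunov(a=1.4, b=0.3, n=100_000, transient=1000):
    """Estimate the largest Lyapunov exponent of the Henon map by
    evolving a tangent vector with the Jacobian and renormalising."""
    x, y = 0.1, 0.1
    for _ in range(transient):                  # settle onto the attractor
        x, y = 1 - a * x * x + y, b * x
    u, v = 1.0, 0.0                             # tangent vector
    total = 0.0
    for _ in range(n):
        # Jacobian of (x, y) -> (1 - a x^2 + y, b x) is [[-2 a x, 1], [b, 0]]
        u, v = -2 * a * x * u + v, b * u
        x, y = 1 - a * x * x + y, b * x
        norm = hypot(u, v)
        total += log(norm)
        u, v = u / norm, v / norm
    return total / n

lam = henon_largest_lyapunov()
print(lam)  # approximately 0.42; positive, indicating chaos
```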
Herding of sheep by dogs is a powerful example of one individual causing many unwilling individuals to move in the same direction. Similar phenomena are central to crowd control, cleaning the environment and other engineering problems. Despite single dogs solving this ‘shepherding problem’ every day, it remains unknown which algorithm they employ or whether a general algorithm exists for shepherding. Here, we demonstrate such an algorithm, based on adaptive switching between collecting the agents when they are too dispersed and driving them once they are aggregated. Our algorithm reproduces key features of empirical data collected from sheep–dog interactions and suggests new ways in which robots can be designed to influence movements of living and artificial agents.
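The adaptive switch at the heart of the algorithm can be sketched as a single decision rule. The thresholds and offsets below are illustrative placeholders, not the paper's fitted values:

```python
import numpy as np

def shepherd_target(sheep, goal, collect_radius=5.0, offset=2.0):
    """Adaptive collect/drive switch for a single shepherding agent.
    sheep: (n, 2) array of positions; goal: (2,) target for the flock.
    Returns the point the dog should move toward and the current mode."""
    centroid = sheep.mean(axis=0)
    dists = np.linalg.norm(sheep - centroid, axis=1)
    if dists.max() > collect_radius:
        # Collect: get behind the farthest straggler, relative to the
        # centroid, to push it back toward the flock.
        stray = sheep[dists.argmax()]
        away = (stray - centroid) / np.linalg.norm(stray - centroid)
        return stray + offset * away, "collect"
    # Drive: get behind the centroid, relative to the goal, to push
    # the aggregated flock toward the goal.
    away = (centroid - goal) / np.linalg.norm(centroid - goal)
    return centroid + offset * away, "drive"

flock = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [12.0, 12.0]])
target, mode = shepherd_target(flock, goal=np.array([-20.0, -20.0]))
print(mode)  # "collect": one sheep is far from the centroid
```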
Complexity science has proliferated across academic domains in recent years. A question arises as to whether any useful sense of ‘generalized complexity’ can be abstracted from the various versions of complexity to be found in the literature, and whether it could prove fruitful in a scientific sense. Most attempts at defining complexity center around two kinds of notion: structural, and temporal or dynamic. Neither of these is able to provide a foundation for the intuitive or generalized notion when taken separately; structure is often a derivative notion, dependent on prior notions of complexity, and dynamic notions such as entropy are often indefinable. The philosophical notion of process may throw light on the tensions and contradictions within complexity. Robustness, for instance, a key quality of complexity, is quite naturally understood within a process-theoretical framework. Understanding complexity as process also helps one align complexity science with holistically oriented predecessors such as General System Theory, while allowing for the reductionist perspective of complexity. These results, however, have the further implication that it may be futile to search for general laws of complexity, or to hope that investigations of complex objects in one domain may throw light on complexity in unrelated domains.
Using the effective complexity measure, proposed by M. Gell-Mann and S. Lloyd, we give a quantitative definition of an emergent property. We use several previous results and properties of this particular information measure, which is closely related to the random features of the entity and its regularities.
Complexity and the Emergence of Physical Properties Miguel Angel Fuentes
Relatively recent work has reported that networks of neurons can produce avalanches of activity whose sizes follow a power law distribution. This suggests that these networks may be operating near a critical point, poised between a phase where activity rapidly dies out and a phase where activity is amplified over time. The hypothesis that the electrical activity of neural networks in the brain is critical is potentially important, as many simulations suggest that information processing functions would be optimized at the critical point. This hypothesis, however, is still controversial. Here we will explain the concept of criticality and review the substantial objections to the criticality hypothesis raised by skeptics. Points and counter points are presented in dialogue form.
The concept of stigmergy has been used to analyze self-organizing activities in an ever-widening range of domains, from social insects via robotics and social media to human society. Yet, it is still poorly understood, and as such its full power remains underappreciated. The present paper clarifies the issue by defining stigmergy as a mechanism of indirect coordination in which the trace left by an action in a medium stimulates a subsequent action. It then analyses the fundamental components of the definition: action, agent, medium, trace and coordination. Stigmergy enables complex, coordinated activity without any need for planning, control, communication, simultaneous presence, or even mutual awareness. This makes the concept applicable to a very broad variety of cases, from chemical reactions to individual cognition and Internet-supported collaboration in Wikipedia. The paper classifies different varieties of stigmergy according to general aspects (number of agents, scope, persistence, sematectonic vs. marker-based, and quantitative vs. qualitative), while emphasizing the fundamental continuity between these cases. This continuity can be understood from a non-linear, self-organizing dynamic that lets more complex forms of coordination evolve out of simpler ones. The paper concludes with two specifically human applications in cognition and cooperation, suggesting that without stigmergy these phenomena may never have evolved.
Heylighen, F. (2015). Stigmergy as a Universal Coordination Mechanism: components, varieties and applications. To appear in T. Lewis & L. Marsh (Eds.), Human Stigmergy: Theoretical Developments and New Applications, Studies in Applied Philosophy, Epistemology and Rational Ethics. Springer. http://pespmc1.vub.ac.be/papers/stigmergy-varieties.pdf
We present an optimization process to estimate parameters in systems of ordinary differential equations from chaotic time series. The optimization technique is based on a variational approach, and numerical studies on noisy time series demonstrate that it is very robust and appropriate for reducing the complexity of the model. The proposed process also allows one to discard parameters with scant influence on the dynamics.
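As a toy stand-in for the general idea of recovering a model parameter from a chaotic series (the paper's method is variational; this sketch is not it), one can fit the logistic-map parameter r by closed-form one-step least squares. The map choice and r = 3.9 are illustrative:

```python
def logistic_series(r, x0=0.3, n=200):
    """Generate n iterates of the logistic map x -> r x (1 - x)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

def fit_r(series):
    """Least-squares fit of r in x_{t+1} = r x_t (1 - x_t), closed form:
    r = sum(x_{t+1} f_t) / sum(f_t^2) with f_t = x_t (1 - x_t)."""
    num = sum(x1 * x0 * (1 - x0) for x0, x1 in zip(series, series[1:]))
    den = sum((x0 * (1 - x0)) ** 2 for x0 in series[:-1])
    return num / den

data = logistic_series(3.9)
print(fit_r(data))  # recovers r = 3.9 on noiseless data
```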
We present theoretical and empirical results demonstrating the usefulness of voting rules for participatory democracies. We first give algorithms which efficiently elicit ϵ-approximations to two prominent voting rules: the Borda rule and the Condorcet winner. This result circumvents previous prohibitive lower bounds and is surprisingly strong: even if the number of ideas is as large as the number of participants, each participant will only have to make a logarithmic number of comparisons, an exponential improvement over the linear number of comparisons previously needed. We demonstrate the approach in an experiment in Finland's recent off-road traffic law reform, observing that the total number of comparisons needed to achieve a fixed ϵ-approximation is linear in the number of ideas and that the constant is not large. Finally, we note a few other experimental observations which support the use of voting rules for aggregation. First, we observe that rating, one of the common alternatives to ranking, manifested effects of bias in our data. Second, we show that very few of the topics lacked a Condorcet winner, one of the prominent negative results in voting. Finally, we show data hinting at a potential future direction: the use of partial rankings as opposed to pairwise comparisons to further decrease the elicitation time.
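For reference, the two exact rules being approximated can be computed by brute force from full rankings (this sketch is not the paper's efficient elicitation scheme, and the ballots are made up):

```python
def borda_winner(rankings):
    """Borda rule: a ranking of m candidates gives m-1 points to the
    first choice, m-2 to the second, ..., 0 to the last."""
    m = len(rankings[0])
    scores = {}
    for ranking in rankings:
        for pos, cand in enumerate(ranking):
            scores[cand] = scores.get(cand, 0) + (m - 1 - pos)
    return max(scores, key=scores.get), scores

def condorcet_winner(rankings):
    """Condorcet winner: beats every other candidate in pairwise
    majority comparisons; returns None if no such candidate exists."""
    cands = rankings[0]
    for a in cands:
        if all(sum(r.index(a) < r.index(b) for r in rankings) * 2 > len(rankings)
               for b in cands if b != a):
            return a
    return None

ballots = [("A", "B", "C"), ("A", "C", "B"), ("B", "A", "C")]
print(borda_winner(ballots)[0], condorcet_winner(ballots))  # A A
```

A Condorcet cycle (each candidate beaten by some other) makes `condorcet_winner` return None, which is the "negative result" the abstract refers to.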