Complex systems present problems in both mathematical modelling and philosophical foundations. The study of complex systems represents a new approach to science that investigates how relationships between parts give rise to the collective behaviors of a system and how the system interacts and forms relationships with its environment. The equations from which models of complex systems are developed generally derive from statistical physics, information theory and non-linear dynamics, and represent organized but unpredictable behaviors of natural systems that are considered fundamentally complex.
Extreme events, a type of collective behavior in complex networked dynamical systems, can often have catastrophic consequences. Developing effective strategies to control extreme events is therefore of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network “mobile” can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding for areas such as cybersecurity are discussed.
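In such studies an "extreme event" is commonly defined as a node's instantaneous load exceeding its mean by several standard deviations. The sketch below illustrates only that counting criterion, on a fixed ring lattice with unbiased random walkers; it is not the paper's mobile-network model, and all parameter values are arbitrary illustrative choices.

```python
import random

def extreme_event_rate(n_sites, n_walkers, steps, k=3.0, seed=0):
    """Fraction of (site, time) observations whose instantaneous load exceeds
    mean + k * std of the load expected under uniform walker placement."""
    rng = random.Random(seed)
    pos = [rng.randrange(n_sites) for _ in range(n_walkers)]
    mean = n_walkers / n_sites
    # binomial standard deviation of a single site's load
    std = (n_walkers * (1 / n_sites) * (1 - 1 / n_sites)) ** 0.5
    threshold = mean + k * std
    extreme = total = 0
    for _ in range(steps):
        # every walker hops to a random neighbour on the ring
        pos = [(p + rng.choice((-1, 1))) % n_sites for p in pos]
        load = [0] * n_sites
        for p in pos:
            load[p] += 1
        extreme += sum(l > threshold for l in load)
        total += n_sites
    return extreme / total

rate = extreme_event_rate(50, 500, 200)
```

With a three-sigma threshold, only a small fraction of observations qualify as extreme, which is what makes such events hard to study without long simulations.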
Controlling extreme events on complex networks • Yu-Zhong Chen, Zi-Gang Huang & Ying-Cheng Lai
Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake in the use of (Shannon) information-theoretic measures to analyse the dynamics of complex systems in these fields. We introduce the Java Information Dynamics Toolkit (JIDT): a Google Code project which provides a standalone, GNU GPL v3-licensed, open-source implementation for empirical estimation of information-theoretic measures from time-series data. While the toolkit provides classic information-theoretic measures (e.g. entropy, mutual information, conditional mutual information), it ultimately focusses on implementing higher-level measures for information dynamics. That is, JIDT focusses on quantifying information storage, transfer and modification, and the dynamics of these operations in space and time. For this purpose, it includes implementations of the transfer entropy and active information storage, their multivariate extensions and local or pointwise variants. JIDT provides implementations for both discrete and continuous-valued data for each measure, including various types of estimator for continuous data (e.g. Gaussian, box-kernel and Kraskov-Stoegbauer-Grassberger), which can be swapped at run-time thanks to Java's object-oriented polymorphism. Furthermore, while written in Java, the toolkit can be used directly in MATLAB, GNU Octave and Python. We present the principles behind the code design, and provide several examples to guide users.
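As a flavour of the kind of measure JIDT implements, transfer entropy for binary time series can be estimated directly from its definition. The following is a plain-Python sketch with history length 1, written from the standard formula; it is not JIDT's API.

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y) in bits for discrete series, history 1:
    sum over p(y_next, y_prev, x_prev) log2[ p(y_next|y_prev,x_prev) / p(y_next|y_prev) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))     # (y_next, y_prev, x_prev)
    n = len(triples)
    c_xyz = Counter(triples)
    c_yz = Counter((yp, xp) for _, yp, xp in triples)
    c_yy = Counter((yn, yp) for yn, yp, _ in triples)
    c_y = Counter(yp for _, yp, _ in triples)
    te = 0.0
    for (yn, yp, xp), c in c_xyz.items():
        p_joint = c / n
        p_cond_full = c / c_yz[(yp, xp)]
        p_cond_hist = c_yy[(yn, yp)] / c_y[yp]
        te += p_joint * log2(p_cond_full / p_cond_hist)
    return te

# y copies x with one step of delay, so x should strongly "transfer" into y
random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]
te_xy = transfer_entropy(x, y)    # close to 1 bit
te_yx = transfer_entropy(y, x)    # close to 0 bits
```

The asymmetry of the two estimates is the point of the measure: it detects directed information flow that mutual information alone cannot.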
"JIDT: An information-theoretic toolkit for studying the dynamics of complex systems" Joseph T. Lizier, arXiv:1408.3270, 2014 http://arxiv.org/abs/1408.3270
The idea is advanced that self-organization in complex systems can be treated as decision making (as it is performed by humans) and, vice versa, that decision making is nothing but a kind of self-organization in the decision maker's nervous system. A mathematical formulation is suggested based on the definition of probabilities of system states, whose particular cases characterize the probabilities of structures, patterns, scenarios, or prospects. In this general framework, it is shown that the mathematical structures of self-organization and of decision making are identical. This makes it clear how self-organization can be seen as an endogenous decision making process and, reciprocally, how decision making occurs via an endogenous self-organization. The approach is illustrated by phase transitions in large statistical systems, crossovers in small statistical systems, evolutions and revolutions in social and biological systems, structural self-organization in dynamical systems, and by the probabilistic formulation of classical and behavioral decision theories. In all these cases, self-organization is described as the process of evaluating the probabilities of macroscopic states or prospects in the search for a state with the largest probability. The general way of deriving the probability measure for classical systems is the principle of minimal information, that is, conditional entropy maximization under given constraints. Behavioral biases of decision makers can be characterized in the same way, as analogous to quantum fluctuations in natural systems.
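The "principle of minimal information" invoked here is the familiar constrained entropy maximization; Jaynes' dice example makes it concrete. The sketch below (an illustrative computation, not from the paper) finds the maximum-entropy distribution on a six-sided die with a prescribed mean, using a bisection search on the Lagrange multiplier of the exponential-family solution.

```python
from math import exp

def maxent_die(target_mean, lo=-10.0, hi=10.0, tol=1e-12):
    """Max-entropy distribution on faces 1..6 with a fixed mean:
    p_k is proportional to exp(beta * k), with beta found by bisection."""
    faces = range(1, 7)

    def mean(beta):
        w = [exp(beta * k) for k in faces]
        z = sum(w)
        return sum(k * wk for k, wk in zip(faces, w)) / z

    # mean(beta) is strictly increasing, so bisection converges
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2
    w = [exp(beta * k) for k in faces]
    z = sum(w)
    return [wk / z for wk in w]

p = maxent_die(4.5)   # a die whose long-run average is 4.5 instead of 3.5
```

The result is tilted toward higher faces, the least biased distribution consistent with the constraint.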
We show that dynamical systems with spatial degrees of freedom naturally evolve into a self-organized critical point. Flicker noise, or 1/f noise, can be identified with the dynamics of the critical state. This picture also yields insight into the origin of fractal objects.
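The canonical model behind this picture is the Bak-Tang-Wiesenfeld sandpile, which can be sketched in a few lines. Grid size and number of grain drops below are arbitrary illustrative choices; the open boundary lets grains leave the system, which is what allows a stationary critical state.

```python
import random

def drop_and_relax(grid, L, i, j):
    """Add a grain at (i, j) and topple until stable; return the avalanche size
    (total number of topplings). Sites with >= 4 grains shed one grain to each
    of their four neighbours; grains leave through the open boundary."""
    grid[i][j] += 1
    unstable, size = [(i, j)], 0
    while unstable:
        a, b = unstable.pop()
        if grid[a][b] < 4:
            continue
        grid[a][b] -= 4
        size += 1
        for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            na, nb = a + da, b + db
            if 0 <= na < L and 0 <= nb < L:
                grid[na][nb] += 1
                unstable.append((na, nb))
    return size

rng = random.Random(0)
L = 20
grid = [[0] * L for _ in range(L)]
sizes = [drop_and_relax(grid, L, rng.randrange(L), rng.randrange(L))
         for _ in range(20000)]
```

After a transient, the pile hovers near criticality: most drops cause no toppling at all, while occasional avalanches involve a large fraction of the lattice, a heavy-tailed mixture no single scale describes.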
The character of the time-asymptotic evolution of physical systems can have complex, singular behavior with variation of a system parameter, particularly when chaos is involved. A perturbation of the parameter by a small amount ϵ can convert an attractor from chaotic to non-chaotic or vice-versa. We call a parameter value where this can happen ϵ-uncertain. The probability that a random choice of the parameter is ϵ-uncertain commonly scales like a power law in ϵ. Surprisingly, two seemingly similar ways of defining this scaling, both of physical interest, yield different numerical values for the scaling exponent. We show why this happens and present a quantitative analysis of this phenomenon.
We study the two-particle annihilation reaction A+B→∅ on interconnected scale-free networks. We show that the mixing of particles and the evolution of the process are influenced by the number of interconnecting links and by their functional properties; surprisingly, when the interconnecting links have the same function as the links within the networks, the process is not affected by the interconnectivity strategy in use. Due to the better mixing, which suppresses the segregation effect, we show that the reaction rates are faster than what was observed in other topologies, in line with previous studies performed on single scale-free networks.
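The reaction itself is easy to simulate. The sketch below runs A+B→∅ for random walkers on a plain ring lattice, a minimal stand-in for the interconnected scale-free topologies of the paper; site count, particle count and step count are arbitrary choices.

```python
import random

def annihilate_on_ring(n_sites, n_pairs, steps, seed=0):
    """A + B -> 0 for random walkers on a ring: every particle hops to a random
    neighbour each step, then co-located A-B pairs annihilate. Returns the
    particle count over time."""
    rng = random.Random(seed)
    particles = ([(rng.randrange(n_sites), 'A') for _ in range(n_pairs)]
                 + [(rng.randrange(n_sites), 'B') for _ in range(n_pairs)])
    history = [len(particles)]
    for _ in range(steps):
        particles = [((p + rng.choice((-1, 1))) % n_sites, s) for p, s in particles]
        site_a, site_b = {}, {}
        for p, s in particles:
            (site_a if s == 'A' else site_b).setdefault(p, []).append((p, s))
        survivors = []
        for p in set(site_a) | set(site_b):
            a, b = site_a.get(p, []), site_b.get(p, [])
            k = min(len(a), len(b))     # k A-B pairs annihilate at this site
            survivors += a[k:] + b[k:]
        particles = survivors
        history.append(len(particles))
    return history

h = annihilate_on_ring(100, 200, 500)
```

On low-dimensional lattices like this ring, surviving A and B particles segregate into domains, which slows the decay; the paper's point is that interconnected networks mix particles well enough to suppress that effect.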
To further advance our understanding of the brain, new concepts and theories are needed. In particular, the ability of the brain to create information flows must be reconciled with its propensity for synchronization and mass action. The theoretical and empirical framework of Coordination Dynamics, a key aspect of which is metastability, is presented as a starting point to study the interplay of integrative and segregative tendencies that are expressed in space and time during the normal course of brain and behavioral function. Some recent shifts in perspective are emphasized that may ultimately lead to a better understanding of brain complexity.
Understanding the assembly of ecosystems to estimate the number of species at different spatial scales is a challenging problem. Until now, maximum entropy approaches have lacked the important feature of considering space in an explicit manner. We propose a spatially explicit maximum entropy model suitable to describe spatial patterns such as the species area relationship and the endemic area relationship. Starting from the minimal information extracted from presence/absence data, we compare the behavior of two models considering the occurrence or lack thereof of each species and information on spatial correlations. Our approach uses the information at shorter spatial scales to infer the spatial organization at larger ones. We also hypothesize a possible ecological interpretation of the effective interaction we use to characterize spatial clustering.
The size of cities is known to play a fundamental role in social and economic life. Yet, its relation to the structure of the underlying network of human interactions has not been investigated empirically in detail. In this paper, we map society-wide communication networks to the urban areas of two European countries. We show that both the total number of contacts and the total communication activity grow superlinearly with city population size, according to well-defined scaling relations and resulting from a multiplicative increase that affects most citizens. Perhaps surprisingly, however, the probability that an individual's contacts are also connected with each other remains largely unaffected. These empirical results predict a systematic and scale-invariant acceleration of interaction-based spreading phenomena as cities get bigger, which is numerically confirmed by applying epidemiological models to the studied networks. Our findings should provide a microscopic basis towards understanding the superlinear increase of different socioeconomic quantities with city size, that applies to almost all urban systems and includes, for instance, the creation of new inventions or the prevalence of certain contagious diseases.
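The "well-defined scaling relations" here are power laws Y ~ c N^β with β > 1, and the exponent is estimated by ordinary least squares on log-transformed data. A minimal sketch with synthetic city data (the numbers are illustrative, not the study's):

```python
from math import log

def scaling_exponent(populations, quantities):
    """OLS slope of log(quantity) against log(population),
    i.e. the exponent beta in Y ~ c * N**beta."""
    xs = [log(n) for n in populations]
    ys = [log(q) for q in quantities]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# synthetic cities obeying Y = 0.05 * N**1.12 exactly
pops = [10 ** k for k in range(4, 8)]
contacts = [0.05 * n ** 1.12 for n in pops]
beta = scaling_exponent(pops, contacts)
```

A fitted β above 1 is what "superlinear" means operationally: doubling city population more than doubles total contacts.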
Markus Schläpfer, Luís M. A. Bettencourt, Sébastian Grauwin, Mathias Raschke, Rob Claxton, Zbigniew Smoreda, Geoffrey B. West, and Carlo Ratti The scaling of human interactions with city size J. R. Soc. Interface. 2014 11 20130789; http://dx.doi.org/10.1098/rsif.2013.0789
We present a general formalism for computing Lyapunov exponents and their fluctuations in spatially extended systems described by diffusive fluctuating hydrodynamics, thus extending the concepts of dynamical system theory to a broad range of non-equilibrium systems. Our analytical results compare favorably with simulations of a lattice model of heat conduction. We further show how the computation of Lyapunov exponents for the Symmetric Simple Exclusion Process relates to damage spreading and to a two-species pair annihilation process, for which our formalism yields new finite size results.
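For a single low-dimensional map, the largest Lyapunov exponent is simply the orbit average of log |f'(x)|; the paper's formalism generalizes this idea to spatially extended stochastic systems. A minimal sketch for the logistic map, where the exact value at r = 4 is ln 2 ≈ 0.693:

```python
from math import log

def lyapunov_logistic(r, x0=0.3, n=100_000, burn=1_000):
    """Largest Lyapunov exponent of x -> r x (1 - x):
    the time average of log|f'(x)| = log|r (1 - 2x)| along the orbit."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n
```

A positive exponent (as at r = 4) signals chaos; in a periodic window (as at r = 3.2) the average is negative.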
Herding of sheep by dogs is a powerful example of one individual causing many unwilling individuals to move in the same direction. Similar phenomena are central to crowd control, cleaning the environment and other engineering problems. Despite single dogs solving this ‘shepherding problem’ every day, it remains unknown which algorithm they employ or whether a general algorithm exists for shepherding. Here, we demonstrate such an algorithm, based on adaptive switching between collecting the agents when they are too dispersed and driving them once they are aggregated. Our algorithm reproduces key features of empirical data collected from sheep–dog interactions and suggests new ways in which robots can be designed to influence movements of living and artificial agents.
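The core of such an algorithm is the switching decision itself, which can be stated in a few lines. This is a toy sketch of the collect/drive rule only, not the authors' full agent model; `radius` is a hypothetical dispersion threshold.

```python
import math

def shepherd_mode(sheep, radius):
    """Adaptive switching: 'collect' (chase the furthest straggler back) when
    the flock is more dispersed than `radius`, else 'drive' (push the compact
    flock toward the target)."""
    cx = sum(p[0] for p in sheep) / len(sheep)
    cy = sum(p[1] for p in sheep) / len(sheep)
    spread = max(math.hypot(x - cx, y - cy) for x, y in sheep)
    return "collect" if spread > radius else "drive"

compact = [(0, 0), (1, 0), (0, 1)]
dispersed = compact + [(10, 10)]
```

The rule's appeal is that a single agent needs only the flock centroid and the worst straggler, both locally observable quantities.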
Complexity science has proliferated across academic domains in recent years. A question arises as to whether any useful sense of ‘generalized complexity’ can be abstracted from the various versions of complexity to be found in the literature, and whether it could prove fruitful in a scientific sense. Most attempts at defining complexity center around two kinds of notions: structural, and temporal or dynamic. Neither of these is able to provide a foundation for the intuitive or generalized notion when taken separately; structure is often a derivative notion, dependent on prior notions of complexity, and dynamic notions such as entropy are often indefinable. The philosophical notion of process may throw light on the tensions and contradictions within complexity. Robustness, for instance, a key quality of complexity, is quite naturally understood within a process-theoretical framework. Understanding complexity as process also helps one align complexity science with holistically oriented predecessors such as General System Theory, while allowing for the reductionist perspective of complexity. These results, however, have the further implication that it may be futile to search for general laws of complexity, or to hope that investigations of complex objects in one domain may throw light on complexity in unrelated domains.
Using the effective complexity measure, proposed by M. Gell-Mann and S. Lloyd, we give a quantitative definition of an emergent property. We use several previous results and properties of this particular information measure closely related to the random features of the entity and its regularities.
Complexity and the Emergence of Physical Properties Miguel Angel Fuentes
Relatively recent work has reported that networks of neurons can produce avalanches of activity whose sizes follow a power law distribution. This suggests that these networks may be operating near a critical point, poised between a phase where activity rapidly dies out and a phase where activity is amplified over time. The hypothesis that the electrical activity of neural networks in the brain is critical is potentially important, as many simulations suggest that information processing functions would be optimized at the critical point. This hypothesis, however, is still controversial. Here we will explain the concept of criticality and review the substantial objections to the criticality hypothesis raised by skeptics. Points and counter points are presented in dialogue form.
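A minimal way to see why power-law avalanche sizes suggest criticality is a branching process: each active unit activates each of two downstream units with probability p, so p = 0.5 gives a critical branching ratio of 1. The sketch below (illustrative parameters, not a neural model from the article) contrasts the subcritical and critical regimes.

```python
import random

def avalanche_size(p, rng, cap=100_000):
    """Total number of activations in a branching process where every active
    unit activates each of its two targets independently with probability p."""
    active, size = 1, 0
    while active and size < cap:
        size += active
        active = sum((rng.random() < p) + (rng.random() < p)
                     for _ in range(active))
    return size

rng = random.Random(42)
subcritical = [avalanche_size(0.25, rng) for _ in range(2000)]  # branching ratio 0.5
critical = [avalanche_size(0.5, rng) for _ in range(500)]       # branching ratio 1.0
```

Subcritical avalanches die out quickly with a mean size of 1/(1 - m) = 2 for branching ratio m = 0.5, while critical avalanches have a heavy-tailed size distribution: that qualitative difference is what the experimental power laws are taken to indicate.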
The concept of stigmergy has been used to analyze self-organizing activities in an ever-widening range of domains, from social insects via robotics and social media to human society. Yet, it is still poorly understood, and as such its full power remains underappreciated. The present paper clarifies the issue by defining stigmergy as a mechanism of indirect coordination in which the trace left by an action in a medium stimulates a subsequent action. It then analyses the fundamental components of the definition: action, agent, medium, trace and coordination. Stigmergy enables complex, coordinated activity without any need for planning, control, communication, simultaneous presence, or even mutual awareness. This makes the concept applicable to a very broad variety of cases, from chemical reactions to individual cognition and Internet-supported collaboration in Wikipedia. The paper classifies different varieties of stigmergy according to general aspects (number of agents, scope, persistence, sematectonic vs. marker-based, and quantitative vs. qualitative), while emphasizing the fundamental continuity between these cases. This continuity can be understood from a non-linear, self-organizing dynamic that lets more complex forms of coordination evolve out of simpler ones. The paper concludes with two specifically human applications in cognition and cooperation, suggesting that without stigmergy these phenomena may never have evolved.
Heylighen, F. (2015). Stigmergy as a Universal Coordination Mechanism: components, varieties and applications. To appear in T. Lewis & L. Marsh (Eds.), Human Stigmergy: Theoretical Developments and New Applications, Studies in Applied Philosophy, Epistemology and Rational Ethics. Springer. http://pespmc1.vub.ac.be/papers/stigmergy-varieties.pdf
We present an optimization process to estimate parameters in systems of ordinary differential equations from chaotic time series. The optimization technique is based on a variational approach, and numerical studies on noisy time series demonstrate that it is very robust and well suited to reducing the complexity of the model. The proposed process also allows one to discard parameters with little influence on the dynamics.
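In the simplest case, a single parameter entering the dynamics linearly, the one-step least-squares estimate is closed-form. This toy version for the discrete logistic map (not the paper's variational method for ODEs) recovers the parameter from a noise-free chaotic series:

```python
def fit_logistic_r(series):
    """Least-squares one-step fit of r in x_{t+1} = r x_t (1 - x_t):
    minimizing sum (x_{t+1} - r u_t)^2 with u_t = x_t (1 - x_t) gives
    r = sum(x_{t+1} u_t) / sum(u_t^2)."""
    num = den = 0.0
    for xt, xn in zip(series, series[1:]):
        u = xt * (1 - xt)
        num += xn * u
        den += u * u
    return num / den

# generate a chaotic series at r = 3.8, then recover r from the data alone
r_true, x = 3.8, 0.3
series = []
for _ in range(500):
    series.append(x)
    x = r_true * x * (1 - x)
r_hat = fit_logistic_r(series)
```

Chaos is no obstacle here: the fit uses only one-step pairs, so sensitivity to initial conditions never enters. The hard problems the paper addresses arise with noise, continuous time, and many interacting parameters.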
We present theoretical and empirical results demonstrating the usefulness of voting rules for participatory democracies. We first give algorithms which efficiently elicit ε-approximations to two prominent voting rules: the Borda rule and the Condorcet winner. This result circumvents previous prohibitive lower bounds and is surprisingly strong: even if the number of ideas is as large as the number of participants, each participant will only have to make a logarithmic number of comparisons, an exponential improvement over the linear number of comparisons previously needed. We demonstrate the approach in an experiment in Finland's recent off-road traffic law reform, observing that the total number of comparisons needed to achieve a fixed ε-approximation is linear in the number of ideas and that the constant is not large. Finally, we note a few other experimental observations which support the use of voting rules for aggregation. First, we observe that rating, one of the common alternatives to ranking, manifested effects of bias in our data. Second, we show that very few of the topics lacked a Condorcet winner, one of the prominent negative results in voting. Finally, we show data hinting at a potential future direction: the use of partial rankings as opposed to pairwise comparisons to further decrease the elicitation time.
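The two rules being approximated are straightforward to state exactly. Below are reference implementations over full rankings (the ballots are toy data, not the Finnish experiment's); the paper's contribution is eliciting approximations of these with far fewer comparisons.

```python
def borda_winner(rankings):
    """Borda rule: each ballot (best first) gives a candidate m-1-position points;
    the highest total score wins."""
    cands = rankings[0]
    m = len(cands)
    score = {c: 0 for c in cands}
    for r in rankings:
        for pos, c in enumerate(r):
            score[c] += m - 1 - pos
    return max(score, key=score.get)

def condorcet_winner(rankings):
    """The candidate beating every other in pairwise majority, or None
    (a Condorcet cycle) if no such candidate exists."""
    cands = rankings[0]
    for c in cands:
        if all(sum(r.index(c) < r.index(d) for r in rankings) * 2 > len(rankings)
               for d in cands if d != c):
            return c
    return None

votes = [["a", "b", "c"], ["a", "c", "b"], ["b", "a", "c"]]
```

On these ballots both rules agree on "a"; the classic three-ballot cycle (a>b>c, b>c>a, c>a>b) is the standard example where no Condorcet winner exists, the "prominent negative result" the abstract alludes to.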
In his work La Distinction, Pierre Bourdieu discussed how an individual's taste relates to his or her social environment, and how the classification into distinguished and vulgar, among others, arises from this taste at the same time as it shapes it. Robert Axelrod created a computational model with local convergence and global polarization properties to describe the dissemination of culture through simple selective interactions. In this letter, the Axelrod model is modified, while keeping its original principles, to describe Bourdieu's theory. This allows one to analyze, with a simple approach that takes social structures into account, how the dynamics of society's tastes and trends may vary, and to understand which social forces are crucial to changing those dynamics. Despite its relative simplicity, the present approach clarifies symbolic power relations, a relevant issue for understanding power relations on both large and small, localized scales, with impact on activities ranging from daily life to business, politics, and research. The model sheds light on social issues, showing that a small amount of conflict within a class plays a central role in culture dynamics, being chiefly responsible for continuous changes in distinction paradigms.
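The original Axelrod interaction step, which the letter builds on, is compact enough to state directly. This is a sketch of the standard model, not the authors' Bourdieu-inspired modification; the grid size L, number of features F and traits per feature q are arbitrary illustrative choices.

```python
import random

def axelrod_step(grid, L, F, rng):
    """One event of Axelrod's culture model on an L x L torus: a random site
    interacts with a random neighbour with probability equal to their cultural
    overlap, copying one differing feature on success."""
    i, j = rng.randrange(L), rng.randrange(L)
    ni, nj = rng.choice([((i + 1) % L, j), ((i - 1) % L, j),
                         (i, (j + 1) % L), (i, (j - 1) % L)])
    a, b = grid[i][j], grid[ni][nj]
    shared = sum(x == y for x, y in zip(a, b))
    if 0 < shared < F and rng.random() < shared / F:
        k = rng.choice([idx for idx in range(F) if a[idx] != b[idx]])
        a[k] = b[k]

rng = random.Random(7)
L, F, q = 10, 3, 4
grid = [[[rng.randrange(q) for _ in range(F)] for _ in range(L)] for _ in range(L)]
for _ in range(50_000):
    axelrod_step(grid, L, F, rng)
cultures = {tuple(cell) for row in grid for cell in row}

# a fully homogeneous grid is absorbing: interactions can never change it
uniform = [[[0] * F for _ in range(L)] for _ in range(L)]
for _ in range(1000):
    axelrod_step(uniform, L, F, rng)
```

Local convergence and global polarization both follow from the same rule: similar neighbours grow more similar, while completely dissimilar neighbours (overlap zero) can never interact at all.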
One can imagine life evolving again and again, crashing on the rocks of time and circumstance, until finally it hit upon just the right mutation rate—one that eons later would produce organisms and species and ecosystems.