Richard Cook, Royal Institute of Technology, Stockholm. Dr. Richard Cook is the Professor of Healthcare Systems Safety and Chairman of the Department of Patien...
Novelties are a familiar part of daily life. They are also fundamental to the evolution of biological systems, human society, and technology. By opening new possibilities, one novelty can pave the way for others in a process that Kauffman has called “expanding the adjacent possible”. The dynamics of correlated novelties, however, have yet to be quantified empirically or modeled mathematically. Here we propose a simple mathematical model that mimics the process of exploring a physical, biological, or conceptual space that enlarges whenever a novelty occurs. The model, a generalization of Polya's urn, predicts statistical laws for the rate at which novelties happen (Heaps' law) and for the probability distribution on the space explored (Zipf's law), as well as signatures of the process by which one novelty sets the stage for another. We test these predictions on four data sets of human activity: the edit events of Wikipedia pages, the emergence of tags in annotation systems, the sequence of words in texts, and listening to new songs in online music catalogues. By quantifying the dynamics of correlated novelties, our results provide a starting point for a deeper understanding of the adjacent possible and its role in biological, cultural, and technological evolution.
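As a rough illustration of the model class described above, here is a minimal urn-with-triggering simulation: every draw is reinforced with extra copies of the drawn color, and drawing a never-before-seen color injects brand-new colors into the urn. The parameter names (`rho`, `nu`) and exact bookkeeping are illustrative assumptions, not the paper's specification; the returned curve counts distinct elements seen after each step (a Heaps-type curve).

```python
import random

def urn_with_triggering(steps, rho=4, nu=1, seed=0):
    """Pólya urn with triggering (sketch): each draw is reinforced with rho
    extra copies of the drawn color; the first draw of any color injects
    nu + 1 brand-new colors into the urn (the 'adjacent possible' expands).
    Returns the Heaps-type curve: distinct colors seen after each step."""
    rng = random.Random(seed)
    urn = [0]                      # one initial color
    next_color = 1
    seen = set()
    heaps = []
    for _ in range(steps):
        ball = rng.choice(urn)
        urn.extend([ball] * rho)   # reinforcement
        if ball not in seen:       # novelty triggers new possibilities
            seen.add(ball)
            urn.extend(range(next_color, next_color + nu + 1))
            next_color += nu + 1
        heaps.append(len(seen))
    return heaps

heaps = urn_with_triggering(5000)
```

With reinforcement stronger than triggering (`rho > nu`), the number of distinct colors grows sublinearly in the number of draws, which is the qualitative content of Heaps' law.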
It has been nearly 25 years since the publication of Infectious Diseases of Humans, the “vade mecum” of mathematical modeling of infectious disease; the proliferation of epidemiological careers that it initiated is now in its fourth generation. Epidemiological models have proved very powerful in shaping health policy discussions. The complex interactions that lead to pathogen (and pest) outbreaks make it necessary to use models to provide quantitative insights into the counterintuitive outcomes that are the rule in most nonlinear systems. Thus, epidemic models are most interesting when they suggest unexpected outcomes; they are most powerful when they describe the conditions that delineate the worst-case unexpected scenario, and provide a framework in which to compare alternative control strategies. But what are the limits of mathematical models, and what kinds provide insight into emerging disease?

Mathematical models for emerging disease
Andy Dobson
Science, 12 December 2014: Vol. 346, no. 6215, pp. 1294-1295
http://dx.doi.org/10.1126/science.aaa3441
Via Complexity Digest
We consider biological individuality in terms of information-theoretic and graphical principles. Our purpose is to extract, through an algorithmic decomposition, system-environment boundaries supporting individuality. We infer or detect evolved individuals rather than assume that they exist. Given a set of consistent measurements over time, we discover a coarse-grained or quantized description of a system, inducing partitions (which can be nested). Legitimate individual partitions will propagate information from the past into the future, whereas spurious aggregations will not. Individuals are therefore defined in terms of ongoing, bounded information-processing units rather than lists of static features or conventional replication-based definitions, which tend to fail in the case of cultural change. One virtue of this approach is that it could expand the scope of what we consider adaptive or biological phenomena, particularly in the microscopic and macroscopic regimes of molecular and social phenomena.

The Information Theory of Individuality
David Krakauer, Nils Bertschinger, Eckehard Olbrich, Nihat Ay, Jessica C. Flack
http://arxiv.org/abs/1412.2447
Via Complexity Digest
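The criterion that a legitimate partition "propagates information from the past into the future" can be illustrated in a much-simplified, one-step form (our own sketch, not the paper's formalism): compare the mutual information between consecutive coarse-grained states under different candidate partitions.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy (bits) of the empirical distribution over xs."""
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def predictive_information(series, coarse):
    """One-step predictive information I(past; future) of a series under a
    candidate coarse-graining map. An informative partition keeps the past
    predictive of the future; a spurious aggregation destroys this."""
    cg = [coarse(x) for x in series]
    past, future = cg[:-1], cg[1:]
    return entropy(past) + entropy(future) - entropy(list(zip(past, future)))
```

For a deterministic alternating series, the identity partition scores about one bit per step, while lumping all states together scores zero: the latter is a "spurious aggregation" in the paper's sense.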
This is the first in a series of interviews highlighting the work of experts in the field of complex systems science. Dr. Ben Althouse, an Omidyar Fellow at the Santa Fe Institute, is a mathematical epidemiologist focusing on the dynamics of infectious disease transmission. Ben holds both an ScM in Biostatistics and a PhD in Epidemiology from the Johns Hopkins Bloomberg School of Public Health, where he focused on understanding dengue fever and other sylvatic mosquito-borne viruses (arboviruses) in Senegal using mechanistic modeling and the SIR model. Dr. Althouse also attended the Santa Fe Institute’s Complex Systems Summer School during his graduate studies.
Via Jorge Louçã
We introduce a nonpartisan probability distribution on congressional redistricting of North Carolina which emphasizes the equal partition of the population and the compactness of districts. When random districtings are drawn and the results of the 2012 election retabulated under them, we find that an average of 7.6 Democratic representatives are elected. 95% of the randomly sampled redistrictings produced between 6 and 9 Democrats. Both of these facts are in stark contrast with the 4 Democrats elected in the 2012 elections with the same vote counts. This brings into serious question the idea that such elections represent the "will of the people." It underlines the ability of redistricting to undermine the democratic process, while on its face allowing democracy to proceed.
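The retabulation step is simple to sketch: with precinct vote totals held fixed, each sampled districting induces a seat count. The toy vote counts below are purely illustrative (they are not North Carolina data), and the function ignores ties and turnout effects.

```python
def seats_won(districting, dem_votes, rep_votes):
    """Retabulate a fixed election under a districting: districting[i] is
    the district of precinct i. Returns the number of districts carried
    by Democrats (strict majority of the two-party vote)."""
    totals = {}
    for precinct, district in enumerate(districting):
        d, r = totals.get(district, (0, 0))
        totals[district] = (d + dem_votes[precinct], r + rep_votes[precinct])
    return sum(1 for d, r in totals.values() if d > r)

# Hypothetical 4-precinct example: identical votes, different seat counts
dem, rep = [9, 2, 9, 2], [1, 8, 1, 8]
```

Grouping the precincts as [0, 0, 1, 1] yields two Democratic seats; regrouping the very same votes as [0, 1, 0, 1] yields one, which is the core of the paper's point about districting.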
Network infrastructures, such as roads, pipelines, or the power grid, face a multitude of challenges, from organizational and use changes to climate change and resource scarcity. These challenges require the adaptation of existing infrastructures or their complete new development. Traditionally, infrastructure planning and routing issues are solved through top-down optimization strategies, such as mixed-integer nonlinear programming or graph approaches, or through bottom-up approaches, such as particle swarm optimization or ant colony optimization. While some integrated approaches have been proposed in the literature, no direct comparison of the two approaches as applied to the same problem has been reported. Therefore, we implement two routing algorithms to connect a single source node to multiple consuming nodes in a topology with hard boundaries and no-go areas. We compare a geometric graph algorithm finding a (sub)optimal edge-weighted Steiner minimal tree with an Ant Colony Optimization algorithm implemented as an agent-based model. Experimenting with 100 randomly generated routing problems, we find that both algorithms perform surprisingly similarly in terms of topology, cost, and computational performance. We also discovered that by approaching the problem from both top-down and bottom-up perspectives, we were able to enrich both algorithms in a co-evolutionary fashion. Our main finding is that the two algorithms, as currently implemented in our test environment, hardly differ in quality of solution or computational performance. There are, however, significant differences in ease of problem encoding and future extensibility.
This paper presents an idealized design for a legislative system. The concept of idealized design is explained. The paper critiques two critical (and often taken for granted) features of the legislative branches of most contemporary democratic governments: legislators are chosen by election, and the same bodies perform all legislative and meta-legislative functions, for all laws. Seven problems with these two features are described. A new model of lawmaking is proposed, based on three concepts from ancient Athenian democracy — random selection, dividing legislative functions among multiple bodies, and the use of temporary bodies (like contemporary juries) for final decision-making. The benefits of the model are laid out, and likely objections are addressed.
Forecasting epidemic outbreaks has long been the goal of health researchers. By modeling the interactions of two diseases occurring simultaneously, scientists show that specific parameters control the thresholds of epidemics.
Dynamics of Interacting Diseases
Joaquín Sanz, Cheng-Yi Xia, Sandro Meloni, and Yamir Moreno
Phys. Rev. X 4, 041005 (2014)
http://dx.doi.org/10.1103/PhysRevX.4.041005
Via Complexity Digest
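A crude mean-field caricature of interacting diseases (our own sketch, not the multiplex formalism of the paper) already shows how interaction parameters shift epidemic thresholds: a disease that is subcritical on its own can become endemic when infection with the other disease raises susceptibility to it.

```python
def interacting_sis(beta1, beta2, mu1, mu2, eta, steps=20000, dt=0.01):
    """Mean-field sketch of two coupled SIS diseases. Infection with
    disease 1 rescales susceptibility to disease 2 by eta (eta > 1:
    cooperation, eta < 1: competition). Independence of the two infection
    states is assumed -- a crude closure, not the paper's model.
    Returns the long-time infected fractions (i1, i2)."""
    i1, i2 = 0.01, 0.01
    for _ in range(steps):  # forward-Euler integration
        di1 = beta1 * (1 - i1) * i1 - mu1 * i1
        # effective transmission for disease 2, weighted by disease-1 prevalence
        eff2 = beta2 * ((1 - i1) + eta * i1)
        di2 = eff2 * (1 - i2) * i2 - mu2 * i2
        i1 += dt * di1
        i2 += dt * di2
    return i1, i2
```

With `beta2 < mu2`, disease 2 dies out for `eta = 1` (no interaction) but is sustained by a sufficiently cooperative interaction (`eta` large), illustrating an interaction-controlled threshold.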
Understanding, modeling, and predicting the impact of global change on ecosystem functioning across biogeographical gradients can benefit from an enhanced capacity to represent biota as a continuous distribution of traits. However, this is a challenge for the field of biogeography, historically grounded in the species concept. Here we focus on the newly emergent field of functional biogeography: the study of the geographic distribution of trait diversity across organizational levels. We show how functional biogeography bridges species-based biogeography and earth science to provide ideas and tools to help explain gradients in multifaceted diversity (including species, functional, and phylogenetic diversities), predict ecosystem functioning and services worldwide, and infuse regional and global conservation programs with a functional basis. Although much recent progress has been made possible by the rise of multiple data streams, new developments in eco-informatics, and new methodological advances, future directions should provide a theoretical and comprehensive framework for the scaling of biotic interactions across trophic levels and its ecological implications.
Teotihuacan was the first urban civilization of Mesoamerica and one of the largest of the ancient world. Following a tradition in archaeology to equate social complexity with centralized hierarchy, it is widely believed that the city’s origin and growth were controlled by a lineage of powerful individuals. However, much of the data is indicative of a government of co-rulers, and artistic traditions expressed an egalitarian ideology. Yet this alternative keeps being marginalized because the problems of collective action make it difficult to conceive how such a coalition could have functioned in principle. We therefore devised a mathematical model of the city’s hypothetical network of representatives as a formal proof of concept that widespread cooperation was realizable in a fully distributed manner. In the model, decisions become self-organized into globally optimal configurations even though local representatives behave and modify their relations in a rational and selfish manner. This self-optimization crucially depends on occasional communal interruptions of normal activity, and it is impeded when sections of the network are too independent. We relate these insights to theories about community-wide rituals at Teotihuacan and the city’s eventual disintegration.
Proceedings from the 2014 Complex Systems Summer School are now posted, complete with a network map of the students’ collaborations. The students welcome comments and feedback. Included in the proceedings is an exemplary set of more than two dozen papers, more than half of which are being considered for publication. Some of the topics: Can simple models reproduce complex transportation networks? What are the nonlinear effects of pesticides on food dynamics? What role do fractals and scaling play in finance models?
Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one also needs a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on asynchronous cellular automata have shown that such a feature really exists: it is the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for the 1D, 2D, and 3D asynchronous automata with neighborhoods of three, five, and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighborhoods three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case.
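The two ingredients of the claimed result are easy to state in code: counting the zero-one borders of a cyclic binary configuration, and a stationary law exponential in that number. The constant `c` below is a free parameter of this sketch; the cited results derive its value for specific neighborhoods.

```python
from itertools import product
from math import exp

def borders(config):
    """Number of zero-one borders in a cyclic binary configuration:
    positions whose cell differs from its right neighbor."""
    n = len(config)
    return sum(config[i] != config[(i + 1) % n] for i in range(n))

def exponential_law(n, c):
    """Distribution P(x) proportional to exp(-c * borders(x)) over all
    binary rings of length n; c is a free parameter of this sketch."""
    weights = {x: exp(-c * borders(x)) for x in product((0, 1), repeat=n)}
    z = sum(weights.values())
    return {x: w / z for x, w in weights.items()}
```

Under such a law the uniform configurations (all zeros or all ones, zero borders) are the most probable, and probability decays exponentially with each additional border.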

The Lyapunov exponent characterizes the exponential growth rate of the difference between nearby orbits. A positive Lyapunov exponent is a manifestation of chaos. Here, we propose the Lyapunov pair, which is based on the generalized Lyapunov exponent, as a unified characterization of nonexponential and exponential dynamical instabilities in one-dimensional maps. Chaos is classified into three different types, i.e., superexponential, exponential, and subexponential dynamical instabilities. Using one-dimensional maps, we demonstrate superexponential and subexponential chaos and quantify the dynamical instabilities by the Lyapunov pair. In subexponential chaos, we show superweak chaos, meaning that the growth of the difference between nearby orbits is slower than a stretched exponential growth. The scaling of the growth is analytically studied by a recently developed theory of a continuous accumulation process, which is related to infinite ergodic theory.
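For the ordinary (exponential) case, the Lyapunov exponent of a one-dimensional map is the orbit average of log |f′(x)|; the sketch below estimates it for the logistic map (the generalized Lyapunov pair of the paper is not reproduced here).

```python
from math import log

def lyapunov_logistic(r, x0=0.1, n=100_000, burn=1000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    as the orbit average of log|f'(x)| = log|r*(1 - 2x)|."""
    x = x0
    for _ in range(burn):           # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n
```

At r = 4 the map is fully chaotic with exponent ln 2; in a periodic window (e.g. r = 3.2) the estimate is negative, the non-chaotic signature.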
The last decade and a half has seen an ardent development of self-organised criticality (SOC), a new approach to complex systems, which has become important in many domains of natural as well as social science, such as geology, biology, astronomy, and economics, to mention just a few. This has led many to adopt a generalist stance towards SOC, which is now repeatedly claimed to be a universal theory of complex behaviour. The aim of this paper is twofold. First, I provide a brief and nontechnical introduction to SOC. Second, I critically discuss the various bold claims that have been made in connection with it. Throughout, I will adopt a rather sober attitude and argue that some people have been too readily carried away by fancy contentions. My overall conclusion will be that none of these bold claims can be maintained. Nevertheless, stripped of exaggerated expectations and daring assertions, many SOC models are interesting vehicles for promising scientific research.
Progress in science often begins with verbal hypotheses meant to explain why certain biological phenomena exist. An important purpose of mathematical models in evolutionary research, as in many other fields, is to act as “proof-of-concept” tests of the logic in verbal explanations, paralleling the way in which empirical data are used to test hypotheses. Because not all subfields of biology use mathematics for this purpose, misunderstandings of the function of proof-of-concept modeling are common. In the hope of facilitating communication, we discuss the role of proof-of-concept modeling in evolutionary biology.

Servedio MR, Brandvain Y, Dhole S, Fitzpatrick CL, Goldberg EE, et al. (2014) Not Just a Theory—The Utility of Mathematical Models in Evolutionary Biology. PLoS Biol 12(12): e1002017.
http://dx.doi.org/10.1371/journal.pbio.1002017
Via Complexity Digest
The robustness of complex networks against node failure and malicious attack has been of interest for decades, while most of the research has focused on random attack or hub-targeted attack. In many real-world scenarios, however, attacks are neither random nor hub-targeted, but localized, where a group of neighboring nodes in a network are attacked and fail. In this paper we develop a percolation framework to analytically and numerically study the robustness of complex networks against such localized attack. In particular, we investigate this robustness in Erdős-Rényi networks, random-regular networks, and scale-free networks. Our results provide insight into how to better protect networks, enhance cybersecurity, and facilitate the design of more robust infrastructures.
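A localized attack can be mimicked numerically by growing a ball of removed nodes by breadth-first search from a seed node and then measuring the surviving giant component. The sketch below (standard-library only; the attack and measurement details are our own simplifications, not the paper's percolation framework) does this for an Erdős-Rényi network.

```python
import random
from collections import deque

def er_graph(n, p, rng):
    """Erdős-Rényi graph G(n, p) as an adjacency dict of sets."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def localized_attack(adj, fraction, seed_node=0):
    """Remove a connected ball of nodes grown by BFS from seed_node,
    mimicking a localized attack; returns the surviving subgraph."""
    target = fraction * len(adj)
    removed = set()
    queue = deque([seed_node])
    while queue and len(removed) < target:
        v = queue.popleft()
        if v in removed:
            continue
        removed.add(v)
        queue.extend(adj[v] - removed)
    return {v: adj[v] - removed for v in adj if v not in removed}

def giant_fraction(adj):
    """Fraction of nodes in the largest connected component."""
    seen, best = set(), 0
    for s in adj:
        if s in seen:
            continue
        size, stack = 0, [s]
        while stack:
            v = stack.pop()
            if v in seen:
                continue
            seen.add(v)
            size += 1
            stack.extend(adj[v] - seen)
        best = max(best, size)
    return best / max(len(adj), 1)
```

For a dense enough ER network, removing a 30% ball still leaves a large giant component among the survivors, consistent with the relative robustness of such networks to localized damage.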
An increasing number of dissident voices claim that the standard neo-Darwinian view of genes as 'leaders' and phenotypes as 'followers' during the process of adaptive evolution should be turned on its head. This idea is older than the rediscovery of Mendel's laws of inheritance and was given several names before its final 'Baldwin effect' label. A condition for this effect is that environmentally induced variation, such as phenotypic plasticity or learning, is crucial for the initial establishment of a population. This buys the necessary time for natural selection to act on genetic variation, and the adaptive trait can eventually be encoded in the genotype. An influential paper published in the late 1980s showed the Baldwin effect happening in computer simulations, and claimed that it was crucial for solving a difficult adaptive task. This generated much excitement among scholars in various disciplines who regard neo-Darwinian accounts of the evolutionary emergence of high-order phenotypic traits such as consciousness or language as almost hopeless. Here, we use analytical and computational approaches to show that a standard population genetics treatment can easily crack what the scientific community has granted as an unsolvable adaptive problem without learning. The Baldwin effect is once again in need of convincing theoretical foundations.

Phenotypic Plasticity, the Baldwin Effect, and the Speeding up of Evolution: the Computational Roots of an Illusion
Mauro Santos, Eörs Szathmáry, José F. Fontanari
http://arxiv.org/abs/1411.6843
Via Complexity Digest
In this paper, the authors continue to build on their proposed model for incorporating randomly selected citizens into the decision-making processes of government. The first article presented a case for the benefits of random selection; proposed a lawmaking process that replaces elected, all-purpose legislatures with multiple, limited-function bodies composed of randomly selected citizens; and identified possible objections to the model (see An Idealized Design for the Legislative Branch of Government, http://stwj.systemswiki.org/?p=140). In the current article, the authors extend the model to the executive branch, discussing how redesigning the executive branch could improve accountability to the legislature and to the people. The potential for current executive branch designs to negatively affect performance and accountability is used to propose a new model that reduces the power of the executive branch, increases accountability, and has the potential to reduce corruption. The benefits of the model are outlined, and possible objections are addressed.
We develop a general formalism for representing and understanding structure in complex systems. In our view, structure is the totality of relationships among a system's components, and these relationships can be quantified using information theory. In the interest of flexibility we allow information to be quantified using any function, including Shannon entropy and Kolmogorov complexity, that satisfies certain fundamental axioms. Using these axioms, we formalize the notion of a dependency among components, and show how a system's structure is revealed in the amount of information assigned to each dependency. We explore quantitative indices that summarize system structure, providing a new formal basis for the complexity profile and introducing a new index, the "marginal utility of information". Using simple examples, we show how these indices capture intuitive ideas about structure in a quantitative way. Our formalism also sheds light on a long-standing mystery: that the mutual information of three or more variables can be negative. We discuss applications to complex networks, gene regulation, the kinetic theory of fluids, and multiscale cybernetic thermodynamics.

An Information-Theoretic Formalism for Multiscale Structure in Complex Systems
Benjamin Allen, Blake C. Stacey, Yaneer Bar-Yam
http://arxiv.org/abs/1409.4708
Via Complexity Digest
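The "mystery" that the mutual information of three variables can be negative is easy to reproduce: for binary variables with Z = X xor Y, the three-variable mutual information (co-information) is exactly -1 bit, while it vanishes for independent variables. A direct computation from the joint distribution:

```python
from collections import Counter
from itertools import product
from math import log2

def entropy(joint, idx):
    """Shannon entropy (bits) of the marginal over the coordinates in idx,
    given a joint distribution as a dict outcome-tuple -> probability."""
    marg = Counter()
    for outcome, p in joint.items():
        marg[tuple(outcome[i] for i in idx)] += p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

def interaction_information(joint):
    """Three-variable mutual information (co-information):
    I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(XY)-H(XZ)-H(YZ) + H(XYZ)."""
    h = lambda idx: entropy(joint, idx)
    return (h((0,)) + h((1,)) + h((2,))
            - h((0, 1)) - h((0, 2)) - h((1, 2))
            + h((0, 1, 2)))

# XOR: Z = X xor Y, with X, Y independent fair bits
xor_joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}
```

Each pair of XOR variables is independent, yet the triple is fully determined; the negative value records exactly this purely triple-wise (synergistic) dependency.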
This paper is concerned with the limits of narrative understanding, and how they are thrown into relief by the challenge of emergent behaviour in complex systems. Such behaviour is a feature of much more of life than we tend to appreciate, but to recognize emergence is intrinsically to encounter the limits of narrative explanation. If we are not to be led astray by our cognitive dependence upon narrative, we need to talk about emergent behaviour in a way that reaches beyond the limits of narrative sense; in discussions of emergence, sometimes even in definitions of emergence, this has tended to involve a vocabulary of surprise and wonder. I will examine the sources and implications of this vocabulary, and draw out its relation to the specific affordances of narrative sense-making in general, and the functions of narrative perspective and inference in particular. The discussion takes off from attempts to define emergence in complexity science, but goes on to elaborate the argument by appeal to analogous cultural contexts including Christian iconography and belief, Hitchcock on the suspense thriller, and Don DeLillo’s White Noise; it engages with narratological discussions of omniscience and inference, as well as a larger philosophical perspective upon the nature of knowledge.
Malignant cancers that lead to fatal outcomes for patients may remain dormant for very long periods of time. Although individual mechanisms such as cellular dormancy, angiogenic dormancy and immunosurveillance have been proposed, a comprehensive understanding of cancer dormancy and the “switch” from a dormant to a proliferative state still needs to be strengthened from both a basic and clinical point of view. Computational modeling enables one to explore a variety of scenarios for possible but realistic microscopic dormancy mechanisms and their predicted outcomes. The aim of this paper is to devise such a predictive computational model of dormancy with an emergent “switch” behavior. Specifically, we generalize a previous cellular automaton (CA) model for the proliferative growth of a solid tumor that now incorporates a variety of cell-level tumor-host interactions and different mechanisms for tumor dormancy, for example the effects of the immune system. Our new CA rules induce a natural “competition” between the tumor and tumor suppression factors in the microenvironment. This competition either results in a “stalemate” for a period of time in which the tumor either eventually wins (spontaneously emerges) or is eradicated; or it leads to a situation in which the tumor is eradicated before such a “stalemate” could ever develop. We also predict that if the number of actively dividing cells within the proliferative rim of the tumor reaches a critical, yet low level, the dormant tumor has a high probability to resume rapid...
Electrical communication between cardiomyocytes can be perturbed during arrhythmia, but these perturbations are not captured by conventional electrocardiographic metrics. In contrast, information theory metrics can quantify how arrhythmia impacts the sharing of information between individual cells. We developed a theoretical framework to quantify communication during normal and abnormal heart rhythms in two commonly used models of action potential propagation: a reaction-diffusion model and a cellular automata model with realistic restitution properties. For both models, the tissue was simulated as a 2D cell lattice. The time series generated by each cell was coarse-grained to 1 when excited or 0 when resting. The Shannon entropy for each cell and the mutual information between each pair of cells were calculated from the time series during normal heartbeats, spiral wave, anatomical reentry, and multiple wavelets. We found that information sharing between cells was spatially heterogeneous on the simple lattice structure. In addition, arrhythmia significantly impacted information sharing within the heart. Mutual information could distinguish the spiral wave from multiple wavelets, which may help identify the mechanism of cardiac fibrillation in individual patients. Furthermore, entropy localized the path of the drifting core of the spiral wave, which could be an optimal target of therapeutic ablation. We conclude that information theory metrics can quantitatively assess electrical communication among cardiomyocytes. The traditional concept of the heart as a functional syncytium sharing electrical information via gap junctions cannot predict altered entropy and information sharing during complex arrhythmia. Information theory metrics may find clinical application in the identification of rhythm-specific treatments, a need currently unmet by traditional electrocardiographic techniques.
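The coarse-graining and pairwise information steps described above reduce to a few lines: this sketch computes the Shannon entropy of a single cell's 0/1 excitation series and the mutual information between two cells (the propagation models themselves are not reproduced).

```python
from collections import Counter
from math import log2

def entropy_bits(series):
    """Shannon entropy (bits) of a coarse-grained 0/1 excitation series."""
    n = len(series)
    return -sum(c / n * log2(c / n) for c in Counter(series).values())

def mutual_information(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B) between two cells' binary series."""
    return entropy_bits(a) + entropy_bits(b) - entropy_bits(list(zip(a, b)))
```

Two cells excited in lockstep share one full bit per sample, while a permanently resting cell shares none, which is the sense in which the metric maps "information sharing" across the lattice.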
We argue that a critical difference distinguishing machines from organisms and computers from brains is not complexity in a structural sense, but a difference in dynamical organization that is not well accounted for by current complexity measures. We propose a measure of the complexity of a system that is largely orthogonal to computational, information-theoretic, or thermodynamic conceptions of structural complexity. What we call a system’s dynamical depth is a separate dimension of system complexity that measures the degree to which it exhibits discrete levels of nonlinear dynamical organization in which successive levels are distinguished by local entropy reduction and constraint generation. A system with greater dynamical depth than another consists of a greater number of such nested dynamical levels. Thus, a mechanical or linear thermodynamic system has less dynamical depth than an inorganic self-organized system, which has less dynamical depth than a living system. Including an assessment of dynamical depth can provide a more precise and systematic account of the fundamental difference between inorganic systems (low dynamical depth) and living systems (high dynamical depth), irrespective of the number of their parts and the causal relations between them.
