Understanding, modeling, and predicting the impact of global change on ecosystem functioning across biogeographical gradients can benefit from an enhanced capacity to represent biota as a continuous distribution of traits. However, this is a challenge for a field of biogeography historically grounded in the species concept. Here we focus on the newly emergent field of functional biogeography: the study of the geographic distribution of trait diversity across organizational levels. We show how functional biogeography bridges species-based biogeography and earth science to provide ideas and tools to help explain gradients in multifaceted diversity (including species, functional, and phylogenetic diversities), predict ecosystem functioning and services worldwide, and infuse regional and global conservation programs with a functional basis. Although much recent progress has been made possible by the rise of multiple data streams, new developments in ecoinformatics, and new methodological advances, future work should provide a comprehensive theoretical framework for the scaling of biotic interactions across trophic levels and its ecological implications.
Teotihuacan was the first urban civilization of Mesoamerica and one of the largest of the ancient world. Following a tradition in archaeology of equating social complexity with centralized hierarchy, it is widely believed that the city's origin and growth were controlled by a lineage of powerful individuals. However, much of the data points to a government of co-rulers, and artistic traditions expressed an egalitarian ideology. Yet this alternative keeps being marginalized because the problems of collective action make it difficult to conceive how such a coalition could have functioned in principle. We therefore devised a mathematical model of the city's hypothetical network of representatives as a formal proof of concept that widespread cooperation was realizable in a fully distributed manner. In the model, decisions become self-organized into globally optimal configurations even though local representatives behave and modify their relations in a rational and selfish manner. This self-optimization crucially depends on occasional communal interruptions of normal activity, and it is impeded when sections of the network are too independent. We relate these insights to theories about community-wide rituals at Teotihuacan and the city's eventual disintegration.
Proceedings from the 2014 Complex Systems Summer School are now posted, complete with a network map of the students' collaborations. The students welcome comments and feedback. Included in the proceedings are an exemplary set of more than two dozen papers, more than half of which are being considered for publication. Some of the topics: Can simple models reproduce complex transportation networks? What are the nonlinear effects of pesticides on food dynamics? What role do fractals and scaling play in finance models?
Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one also needs a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on asynchronous cellular automata have shown that such a feature does exist: the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for 1D, 2D and 3D asynchronous automata with neighborhoods of three, five and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighborhoods of three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case.
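The statistic at the heart of the abstract, the number of zero-one borders in a cyclic binary configuration, is simple to compute. The sketch below pairs it with a toy synchronous probabilistic update; the transition rule here is a hypothetical stand-in for illustration, not the rule studied in the paper:

```python
import random

def zero_one_borders(config):
    """Count 0-1 borders in a cyclic binary configuration."""
    n = len(config)
    return sum(1 for i in range(n) if config[i] != config[(i + 1) % n])

def sync_step(config, p=0.1):
    """One synchronous update: each cell adopts the majority of its
    three-cell neighbourhood with probability 1-p and the minority
    with probability p (a hypothetical rule, not the paper's)."""
    n = len(config)
    out = []
    for i in range(n):
        maj = 1 if config[(i - 1) % n] + config[i] + config[(i + 1) % n] >= 2 else 0
        out.append(maj if random.random() > p else 1 - maj)
    return out

# Iterate and inspect the border count of the final configuration
config = [random.randint(0, 1) for _ in range(16)]
for _ in range(200):
    config = sync_step(config)
```

Collecting the empirical distribution of configurations over many runs and binning it by `zero_one_borders` is then a direct way to test an exponential-in-borders formula numerically.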
In this review I show that four major kinds of theoretical approaches have been used to explain the scaling of metabolic rate in cells, organisms and groups of organisms in relation to system size. They include models focusing on surface-area-related fluxes of resources and wastes (including heat), internal resource transport, system composition, and various processes affecting resource demand, all of which have been discussed extensively for nearly a century or more. I argue that, although each of these theoretical approaches has been applied to multiple levels of biological organization, none of them alone can fully explain the rich diversity of metabolic scaling relationships, including scaling exponents (log-log slopes) that vary from ~0 to >1. Furthermore, I demonstrate how a synthetic theory of metabolic scaling can be constructed by including the context-dependent action of each of the above modal effects. This "contextual multimodal theory" (CMT) posits that various modulating factors (including metabolic level, surface permeability, body shape, modes of thermoregulation and resource transport, and other internal and external influences) affect the mechanistic expression of each theoretical module. By involving the contingent operation of several mechanisms, the "meta-mechanistic" CMT differs from most metabolic scaling theories, which are deterministically mechanistic. The CMT embraces a systems view of life, and as such recognizes the open, dynamic nature and complex hierarchical and interactive organization of biological systems; the importance of multiple (upward, downward and reciprocal) causation; biological regulation of resource supply and demand and their interaction; and contingent internal (system) and external (environmental) influences on metabolic scaling, all of which are discussed.
I hope that my heuristic attempt at building a unifying theory of metabolic scaling will not only stimulate further testing of all of the various sub-theories composing it, but also foster an appreciation that many current models are, at least in part, complementary or even synergistic, rather than antagonistic. Further exploration of how the scaling of the rates of metabolism and other biological processes are interrelated should also provide the groundwork for formulating a general metabolic theory of biology.
The Santa Fe Institute (SFI) has launched a web-based educational platform, Complexity Explorer. SFI is a private research institute well known for its cross-disciplinary approach to complex systems such as ant colonies, biological cells, economies, and social systems. The stated mission of the institute is to "discover, comprehend, and communicate the common fundamental principles in complex physical, computational, biological, and social systems that underlie many of the most profound problems facing science and society today." As part of the institute's outreach mission, SFI's Complexity Explorer offers free open online courses (MOOCs) as well as searchable repositories of education-related resources. Past SFI MOOCs have attracted over 20,000 enrollees from nearly 100 countries. This fall SFI is offering three free MOOCs for people at different levels of expertise to learn about complex systems.
We evaluated the education system of the United States from 1870 to 2011 using emergy methods. The system was partitioned into three subsystems (elementary, secondary and college/university education) and the emergy inputs required to support each subsystem were determined for every year over the period of analysis. We calculated the emergy required to produce an individual with a given number of years of education by summing over the years of support needed to attain that level of education. In 1983, the emergy per individual ranged from 8.63E+16 semj/ind. for a preschool student to 165.9E+16 semj/ind. for a Ph.D. with 2 years of postdoctoral experience. The emergy of teaching and learning per hour spent in this process was calculated as the sum of the emergy delivered by the education and experience of the teachers and the emergy brought to the process of learning by the students. The emergy of teaching and learning was about an order of magnitude larger than the annual emergy supporting the U.S. education system (i.e., the emergy inflows provided by the environment, energy and materials, teachers, entering students, goods and services). The implication is that teaching and learning is a higher-order social process related to the development and maintenance of the national information cycle. Also, the results imply that there is a 10-fold return on the emergy invested in operating the education system of the United States.
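The accounting step described above, summing yearly support to get the emergy embodied in an individual's education, can be sketched as follows. The per-year values here are made-up placeholders for illustration, not the paper's figures:

```python
# Per-year emergy support values -- made-up placeholders; the paper
# derives the real inputs from national statistics for each year.
yearly_support = {
    "elementary": 5.0e15,
    "secondary": 7.0e15,
    "college": 1.2e16,
}

def emergy_per_individual(years_by_level):
    """Emergy embodied in one individual's education: sum the yearly
    support over every year of schooling attained, per subsystem."""
    return sum(yearly_support[level] * n for level, n in years_by_level.items())

# e.g. a high-school graduate: 8 elementary + 4 secondary years
total = emergy_per_individual({"elementary": 8, "secondary": 4})
```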
Being a Short Treatise on the Nature of Failure; How Failure is Evaluated; How Failure is Attributed to Proximate Cause; and the Resulting New Understanding of Patient Safety
Signal analysis is one of the finest scientific techniques in communication theory. Several quantitative and qualitative measures describe the pattern of a music signal and vary from one signal to another. The same musical recital, when played by different instrumentalists, generates different types of music patterns. The reason behind the various patterns is that the psychoacoustic measures (Dynamics, Timbre, Tonality and Rhythm) vary each time. However, the psychoacoustic study of music signals does not reveal anything about the similarity between the signals. For such cases, study of the synchronization of long-term nonlinear dynamics may provide effective results. In this context, phase synchronization (PS) is one measure of synchronization between two non-identical signals. In fact, it is difficult to investigate any other kind of synchronization under experimental conditions, because the signals are completely non-identical. Also, there exists an equivalence between the phases and the distances from the diagonal line in the recurrence plot (RP) of the signals, which is quantifiable by the recurrence quantification measure tau-recurrence rate. This paper considers two nonlinear music signals based on the same raga, played by two eminent sitar instrumentalists, as two non-identical sources. The psychoacoustic study shows how the Dynamics, Timbre, Tonality and Rhythm vary for the two music signals. Then, long-term analysis in the form of phase space reconstruction is performed, which reveals chaotic phase spaces for both signals. From the RPs of both phase spaces, the tau-recurrence rate is calculated. Finally, a correlation between the normalized tau-recurrence rate of their 3D phase spaces and the PS of the two music signals is established. The numerical results well support the analysis.
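The tau-recurrence rate mentioned above is a standard recurrence quantification measure: the density of recurrent point pairs along the diagonal at distance tau in the recurrence plot. A minimal sketch, with all parameter values chosen purely for illustration:

```python
import numpy as np

def embed(x, dim=3, tau=1):
    """Time-delay embedding of a 1-D signal into dim-dimensional phase space."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def tau_recurrence_rate(x, dim=3, tau=1, eps=0.1, lag=5):
    """Fraction of phase-space point pairs separated by `lag` steps that
    recur within radius eps -- the density along the lag-th diagonal of
    the recurrence plot. Parameter values here are illustrative."""
    X = embed(np.asarray(x, dtype=float), dim, tau)
    d = np.linalg.norm(X[:len(X) - lag] - X[lag:], axis=1)
    return float(np.mean(d < eps))

sig = np.sin(np.linspace(0, 8 * np.pi, 400))
rr = tau_recurrence_rate(sig, dim=3, tau=2, eps=0.2, lag=7)
```

Computing this profile over a range of lags for both signals, normalizing, and correlating the two profiles gives the kind of comparison the paper describes.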
Measures of nonlinearity and complexity, and in particular the study of Lyapunov exponents, have been increasingly used to characterize the dynamical properties of a wide range of biological nonlinear systems, including cardiovascular control. In this work, we present a novel methodology able to effectively estimate the Lyapunov spectrum of a series of stochastic events in an instantaneous fashion. The paradigm relies on a novel point-process high-order nonlinear model of the event series dynamics. The long-term information is taken into account by expanding the linear, quadratic, and cubic Wiener-Volterra kernels with the orthonormal Laguerre basis functions. Applications to synthetic data such as the Hénon map and Rössler attractor, as well as two experimental heartbeat interval datasets (i.e., healthy subjects undergoing postural changes and patients with severe cardiac heart failure), focus on estimation and tracking of the Instantaneous Dominant Lyapunov Exponent (IDLE). The novel cardiovascular assessment demonstrates that our method is able to effectively and instantaneously track the nonlinear autonomic control dynamics, allowing for complexity variability estimations.
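For reference, the conventional batch (non-instantaneous) estimate of the dominant Lyapunov exponent of the Hénon map, one of the synthetic benchmarks mentioned above, can be computed by tangent-vector renormalization (Benettin's method). This is the standard baseline against which an instantaneous estimator would be compared, not the paper's point-process method:

```python
import math

def henon_largest_lyapunov(n=20000, a=1.4, b=0.3):
    """Dominant Lyapunov exponent (nats/iteration) of the Henon map,
    estimated by iterating a tangent vector alongside the orbit and
    renormalizing at each step."""
    x, y = 0.1, 0.1
    for _ in range(100):                      # discard the transient
        x, y = 1 - a * x * x + y, b * x
    vx, vy = 1.0, 0.0                         # tangent vector
    total = 0.0
    for _ in range(n):
        # Jacobian at the current point: [[-2*a*x, 1], [b, 0]]
        vx, vy = -2 * a * x * vx + vy, b * vx
        x, y = 1 - a * x * x + y, b * x
        norm = math.hypot(vx, vy)
        total += math.log(norm)
        vx, vy = vx / norm, vy / norm
    return total / n
```

For the classic parameters a = 1.4, b = 0.3 the estimate converges to roughly 0.42 nats per iteration.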
Herding of sheep by dogs is a powerful example of one individual causing many unwilling individuals to move in the same direction. Similar phenomena are central to crowd control, cleaning the environment and other engineering problems. Despite single dogs solving this ‘shepherding problem’ every day, it remains unknown which algorithm they employ or whether a general algorithm exists for shepherding. Here, we demonstrate such an algorithm, based on adaptive switching between collecting the agents when they are too dispersed and driving them once they are aggregated. Our algorithm reproduces key features of empirical data collected from sheep–dog interactions and suggests new ways in which robots can be designed to influence movements of living and artificial agents.
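The adaptive switch the abstract describes, collect when the flock is dispersed, drive when it is aggregated, reduces to a simple geometric decision per time step. A minimal sketch, with all geometry constants illustrative rather than the fitted values from the paper:

```python
import math

def shepherd_target(agents, goal, f_n):
    """One decision of the collect/drive switch. agents: list of (x, y)
    positions; goal: (x, y) destination; f_n: cohesion radius. Returns
    the point the shepherd should move toward."""
    cx = sum(a[0] for a in agents) / len(agents)
    cy = sum(a[1] for a in agents) / len(agents)
    # the agent furthest from the flock's centroid
    far = max(agents, key=lambda a: math.hypot(a[0] - cx, a[1] - cy))
    d_far = math.hypot(far[0] - cx, far[1] - cy)
    if d_far > f_n:
        # COLLECT: aim just beyond the stray, on its far side from the centroid
        dx, dy = (far[0] - cx) / d_far, (far[1] - cy) / d_far
        return (far[0] + dx, far[1] + dy)
    # DRIVE: aim behind the flock's centroid relative to the goal
    dx, dy = cx - goal[0], cy - goal[1]
    d = math.hypot(dx, dy) or 1.0
    return (cx + dx / d, cy + dy / d)
```

Repeatedly moving the shepherd toward this target, while the agents flee the shepherd and attract one another, reproduces the collect-then-drive cycles seen in sheep-dog data.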
Complexity science has proliferated across academic domains in recent years. A question arises as to whether any useful sense of 'generalized complexity' can be abstracted from the various versions of complexity to be found in the literature, and whether it could prove fruitful in a scientific sense. Most attempts at defining complexity center on two kinds of notions: structural, and temporal or dynamic. Neither of these is able to provide a foundation for the intuitive or generalized notion when taken separately; structure is often a derivative notion, dependent on prior notions of complexity, and dynamic notions such as entropy are often indefinable. The philosophical notion of process may throw light on the tensions and contradictions within complexity. Robustness, for instance, a key quality of complexity, is quite naturally understood within a process-theoretical framework. Understanding complexity as process also helps one align complexity science with holistically oriented predecessors such as General System Theory, while allowing for the reductionist perspective on complexity. These results, however, have the further implication that it may be futile to search for general laws of complexity, or to hope that investigations of complex objects in one domain may throw light on complexity in unrelated domains.

Malignant cancers that lead to fatal outcomes for patients may remain dormant for very long periods of time. Although individual mechanisms such as cellular dormancy, angiogenic dormancy and immunosurveillance have been proposed, a comprehensive understanding of cancer dormancy and the "switch" from a dormant to a proliferative state still needs to be strengthened from both a basic and a clinical point of view. Computational modeling enables one to explore a variety of scenarios for possible but realistic microscopic dormancy mechanisms and their predicted outcomes. The aim of this paper is to devise such a predictive computational model of dormancy with an emergent "switch" behavior. Specifically, we generalize a previous cellular automaton (CA) model for the proliferative growth of solid tumors so that it now incorporates a variety of cell-level tumor-host interactions and different mechanisms for tumor dormancy, for example the effects of the immune system. Our new CA rules induce a natural "competition" between the tumor and tumor-suppression factors in the microenvironment. This competition either results in a "stalemate" for a period of time, after which the tumor eventually wins (spontaneously emerges) or is eradicated, or it leads to a situation in which the tumor is eradicated before such a "stalemate" could ever develop. We also predict that if the number of actively dividing cells within the proliferative rim of the tumor reaches a critical, yet low, level, the dormant tumor has a high probability of resuming rapid growth.
Electrical communication between cardiomyocytes can be perturbed during arrhythmia, but these perturbations are not captured by conventional electrocardiographic metrics. In contrast, information theory metrics can quantify how arrhythmia impacts the sharing of information between individual cells. We developed a theoretical framework to quantify communication during normal and abnormal heart rhythms in two commonly used models of action potential propagation: a reaction-diffusion model and a cellular automata model with realistic restitution properties. For both models, the tissue was simulated as a 2D cell lattice. The time series generated by each cell was coarse-grained to 1 when excited or 0 when resting. The Shannon entropy of each cell and the mutual information between each pair of cells were calculated from the time series during normal heartbeats, spiral wave, anatomical reentry, and multiple wavelets. We found that information sharing between cells was spatially heterogeneous on the simple lattice structure. In addition, arrhythmia significantly impacted information sharing within the heart. Mutual information could distinguish the spiral wave from multiple wavelets, which may help identify the mechanism of cardiac fibrillation in individual patients. Furthermore, entropy localized the path of the drifting core of the spiral wave, which could be an optimal target of therapeutic ablation. We conclude that information theory metrics can quantitatively assess electrical communication among cardiomyocytes. The traditional concept of the heart as a functional syncytium sharing electrical information via gap junctions cannot predict altered entropy and information sharing during complex arrhythmia. Information theory metrics may find clinical application in the identification of rhythm-specific treatments, a need currently unmet by traditional electrocardiographic techniques.
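The two quantities at the core of this framework, Shannon entropy of a coarse-grained cell trace and mutual information between a pair of cells, have compact definitions. A minimal sketch for the binary (excited/resting) time series described above:

```python
import math
from collections import Counter

def entropy(series):
    """Shannon entropy (bits) of a coarse-grained 0/1 excitation trace."""
    n = len(series)
    return -sum(c / n * math.log2(c / n) for c in Counter(series).values())

def mutual_information(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B) between two cells' traces."""
    return entropy(a) + entropy(b) - entropy(list(zip(a, b)))
```

Applying `entropy` per lattice site and `mutual_information` per site pair yields the spatial maps the paper uses to distinguish spiral waves from multiple wavelets.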
We argue that a critical difference distinguishing machines from organisms and computers from brains is not complexity in a structural sense, but a difference in dynamical organization that is not well accounted for by current complexity measures. We propose a measure of the complexity of a system that is largely orthogonal to computational, information-theoretic, or thermodynamic conceptions of structural complexity. What we call a system's dynamical depth is a separate dimension of system complexity that measures the degree to which it exhibits discrete levels of nonlinear dynamical organization in which successive levels are distinguished by local entropy reduction and constraint generation. A system with greater dynamical depth than another consists of a greater number of such nested dynamical levels. Thus, a mechanical or linear thermodynamic system has less dynamical depth than an inorganic self-organized system, which has less dynamical depth than a living system. Including an assessment of dynamical depth can provide a more precise and systematic account of the fundamental difference between inorganic systems (low dynamical depth) and living systems (high dynamical depth), irrespective of the number of their parts and the causal relations between them.
http://cssociety.org The purpose of the Society is to promote the development of all aspects of complex systems science in the countries of Europe, as well as the whole international scientific community.
Via Complexity Digest
Introduction to Complexity Course
Nature's large-scale patterns emerge from incomplete surveys, thanks to ideas borrowed from information theory.
The long-term behavior of nonlinear deterministic continuous-time signals can be studied in terms of their reconstructed attractors. The reconstructed attractors of a continuous signal are meant to be topologically equivalent representations of the dynamics of the unknown dynamical system which generates the signal. Sometimes the geometry of the attractor or its complexity may give important information about the system of interest. However, if the trajectories of the attractor behave as if they are not coming from a continuous system, or there exist many spike-like structures on the path of the system trajectories, then there is no way to characterize the shape of the attractor. In this article, the traditional attractor reconstruction method is first used for two types of ECG signals: normal healthy persons (NHP) and congestive heart failure patients (CHFP). As is common in such a framework, the reconstructed attractors are not at all well formed, and hence it is not possible to adequately characterize their geometrical features. Thus, we incorporate frequency-domain information into the given time signals. This is done by transforming the signals to a time-frequency domain by means of suitable wavelet transforms (WT). The transformed signal involves two non-homogeneous variables and is still quite difficult to use to reconstruct any dynamics. By applying a suitable mapping, this signal is further converted into the integer domain, and a new type of 3D plot, called an integer lag plot, which characterizes and distinguishes the ECG signals of NHP and CHFP, is finally obtained.
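The "traditional attractor reconstruction method" referred to above is Takens-style time-delay embedding. A minimal sketch; the embedding dimension and delay here are illustrative, whereas in practice they are chosen via mutual-information and false-nearest-neighbour criteria:

```python
import numpy as np

def reconstruct_attractor(signal, dim=3, delay=10):
    """Time-delay embedding: each phase-space point is
    (x(t), x(t+delay), ..., x(t+(dim-1)*delay))."""
    x = np.asarray(signal, dtype=float)
    n = len(x) - (dim - 1) * delay
    if n <= 0:
        raise ValueError("signal too short for this dim/delay")
    return np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])

# A clean periodic signal reconstructs to a closed loop in phase space;
# noisy ECG traces produce the ill-formed attractors the paper describes.
t = np.linspace(0, 20 * np.pi, 2000)
attractor = reconstruct_attractor(np.sin(t), dim=3, delay=25)
```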
Extreme events, a type of collective behavior in complex networked dynamical systems, often can have catastrophic consequences. Developing effective strategies to control extreme events is of fundamental importance and practical interest. Utilizing transportation dynamics on complex networks as a prototypical setting, we find that making the network "mobile" can effectively suppress extreme events. A striking, resonance-like phenomenon is uncovered, where an optimal degree of mobility exists for which the probability of extreme events is minimized. We derive an analytic theory to understand the mechanism of control at a detailed and quantitative level, and validate the theory numerically. Implications of our finding for current areas such as cybersecurity are discussed.

"Controlling extreme events on complex networks"
Yu-Zhong Chen, Zi-Gang Huang & Ying-Cheng Lai
Scientific Reports 4, Article number: 6121
http://dx.doi.org/10.1038/srep06121
Via Claudia Mihai, Complexity Digest
Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake in the use of (Shannon) information-theoretic measures to analyse the dynamics of complex systems in these fields. We introduce the Java Information Dynamics Toolkit (JIDT): a Google Code project which provides a standalone, open-source (GNU GPL v3 licensed) code implementation for empirical estimation of information-theoretic measures from time-series data. While the toolkit provides classic information-theoretic measures (e.g. entropy, mutual information, conditional mutual information), it ultimately focusses on implementing higher-level measures for information dynamics. That is, JIDT focusses on quantifying information storage, transfer and modification, and the dynamics of these operations in space and time. For this purpose, it includes implementations of the transfer entropy and active information storage, their multivariate extensions and local or pointwise variants. JIDT provides implementations for both discrete and continuous-valued data for each measure, including various types of estimator for continuous data (e.g. Gaussian, box-kernel and Kraskov-Stoegbauer-Grassberger) which can be swapped at runtime due to Java's object-oriented polymorphism. Furthermore, while written in Java, the toolkit can be used directly in MATLAB, GNU Octave and Python. We present the principles behind the code design, and provide several examples to guide users.

"JIDT: An information-theoretic toolkit for studying the dynamics of complex systems"
Joseph T. Lizier, arXiv:1408.3270, 2014
http://arxiv.org/abs/1408.3270
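JIDT itself is a Java library; as a plain-Python illustration of the central quantity it estimates, here is a minimal transfer entropy for discrete series with history length 1. This is a toy version only, since JIDT's own calculators add longer histories, bias correction and the continuous-data estimators mentioned above:

```python
import math
from collections import Counter

def transfer_entropy(source, dest):
    """Transfer entropy (bits) from `source` to `dest`, history length 1:
    how much knowing source[t] reduces uncertainty about dest[t+1]
    beyond what dest[t] already tells us."""
    triples = list(zip(dest[1:], dest[:-1], source[:-1]))
    n = len(triples)
    p_full = Counter(triples)                            # (x_next, x_prev, y)
    p_prev_src = Counter((xp, y) for _, xp, y in triples)
    p_next_prev = Counter((xn, xp) for xn, xp, _ in triples)
    p_prev = Counter(xp for _, xp, _ in triples)
    te = 0.0
    for (xn, xp, y), c in p_full.items():
        cond_full = c / p_prev_src[(xp, y)]              # p(x_next | x_prev, y)
        cond_prev = p_next_prev[(xn, xp)] / p_prev[xp]   # p(x_next | x_prev)
        te += (c / n) * math.log2(cond_full / cond_prev)
    return te
```

When `dest` is a one-step-delayed copy of an aperiodic `source`, the estimate is strongly positive; for a constant source it is exactly zero.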

Throughout my years playing around with fractals, the Sierpinski triangle has been a consistent staple. The triangle is named after Wacław Sierpiński, and, as fractals are wont to do, the pattern appears in many places, so there are many different ways of constructing the triangle on a computer.
All of the methods are fundamentally iterative. The most obvious method is probably the triangle-in-triangle approach. We start with one triangle, and at every step we replace each triangle with 3 sub-triangles:
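A minimal sketch of that subdivision step (the starting vertex coordinates are arbitrary; any triangle works):

```python
def sierpinski(depth):
    """Triangle-in-triangle construction: start with one triangle and
    repeatedly replace each triangle by its three corner sub-triangles,
    leaving the central (inverted) one out."""
    def midpoint(p, q):
        return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

    triangles = [((0.0, 0.0), (1.0, 0.0), (0.5, 1.0))]
    for _ in range(depth):
        next_gen = []
        for a, b, c in triangles:
            ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
            next_gen += [(a, ab, ca), (ab, b, bc), (ca, bc, c)]
        triangles = next_gen
    return triangles
```

Each iteration triples the triangle count, so after n steps there are 3^n triangles, which is exactly the self-similar scaling that gives the fractal its dimension of log 3 / log 2.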