The Schelling model is a simple agent-based model that demonstrates how individuals' relocation decisions generate residential segregation in cities. Agents belong to one of two groups and occupy cells of a rectangular space. Each agent reacts to the fraction of agents of its own group within the neighborhood around its cell: it stays put when this fraction is above a given tolerance threshold but seeks a new location when the fraction is below the threshold. The model is well known for its tipping-point behavior: an initially random (integrated) pattern remains integrated when the tolerance threshold is below 1/3 but becomes segregated when the tolerance threshold is above 1/3. In this paper, we demonstrate that the variety of the Schelling model's steady patterns is richer than the segregation-integration dichotomy and includes patterns that consist of segregated patches of each of the two groups alongside patches where both groups are spatially integrated. We obtain such patterns by considering a general version of the model in which the mechanisms of agents' interactions remain the same but the tolerance threshold varies between agents of both groups. We show that the model produces patterns of mixed integration and segregation when the tolerance threshold of most agents is either below the tipping point or above 2/3. In these cases, the mixed patterns are relatively insensitive to the model's parameters.
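A minimal sketch of the classic dynamics described above, in Python. The 20x20 toroidal grid, 10% vacancy rate, and single shared tolerance threshold are illustrative assumptions (the paper's general version gives each agent its own threshold); unhappy agents jump to a random empty cell.

```python
import random

SIZE, TOLERANCE, EMPTY = 20, 0.4, None  # illustrative parameters, not the paper's

def init_grid(size=SIZE, vacancy=0.1, seed=1):
    """Random mix of two groups (0 and 1) with a fraction of empty cells."""
    random.seed(seed)
    cells = []
    for _ in range(size * size):
        r = random.random()
        cells.append(EMPTY if r < vacancy else (0 if r < (1 + vacancy) / 2 else 1))
    return [cells[i * size:(i + 1) * size] for i in range(size)]

def own_group_fraction(grid, x, y):
    """Fraction of same-group agents among the 8 occupied neighbours (torus)."""
    size, me = len(grid), grid[y][x]
    neigh = [grid[(y + dy) % size][(x + dx) % size]
             for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dx, dy) != (0, 0)]
    occupied = [c for c in neigh if c is not EMPTY]
    return 1.0 if not occupied else sum(c == me for c in occupied) / len(occupied)

def step(grid, tolerance=TOLERANCE):
    """One sweep: every dissatisfied agent moves to a random empty cell."""
    size = len(grid)
    empties = [(x, y) for y in range(size) for x in range(size)
               if grid[y][x] is EMPTY]
    moved = 0
    for y in range(size):
        for x in range(size):
            if grid[y][x] is EMPTY:
                continue
            if own_group_fraction(grid, x, y) < tolerance and empties:
                ex, ey = empties.pop(random.randrange(len(empties)))
                grid[ey][ex], grid[y][x] = grid[y][x], EMPTY
                empties.append((x, y))
                moved += 1
    return moved
```

With a tolerance of 0.4, above the 1/3 tipping point, repeated sweeps drive the average own-group fraction well above its initial value of roughly one half.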
An insight borrowed from computer science suggests that evolution values both fitness and diversity.
Via Spaceweaver
Two great trends are evident in the evolution of life on Earth: towards increasing diversification and towards increasing integration. Diversification has spread living processes across the planet, progressively increasing the range of environments and free-energy sources exploited by life. Integration has proceeded through a stepwise process in which living entities at one level are integrated into cooperative groups that become larger-scale entities at the next level, and so on, producing cooperative organizations of increasing scale (for example, cooperative groups of simple cells gave rise to the more complex eukaryote cells, groups of these gave rise to multicellular organisms, and cooperative groups of these organisms produced animal societies). The trend towards increasing integration has continued during human evolution with the progressive increase in the scale of human groups and societies. The trends towards increasing diversification and integration are both driven by selection. An understanding of the trajectory and causal drivers of the trends suggests that they are likely to culminate in the emergence of a global entity. This entity would emerge from the integration of the living processes, matter, energy and technology of the planet into a global cooperative organization. Such an integration of the results of previous diversifications would enable the global entity to exploit the widest possible range of resources across the varied circumstances of the planet. This paper demonstrates that its case for directionality meets the tests and criticisms that have proven fatal to previous claims of directionality in evolution.
The direction of evolution: The rise of cooperative organization. John E. Stewart. Biosystems, available online 1 June 2014. http://dx.doi.org/10.1016/j.biosystems.2014.05.006
Via Complexity Digest
Immersed in the world of Balinese water temples and cooperative farms, anthropologist J. Stephen Lansing carried out NSF-funded research that helped win UNESCO World Heritage Site status for Bali's subaks.
Since their inception at the Macy conferences in the late 1940s, complex systems have remained the most controversial topic of the interdisciplinary sciences, and 'complex system' is among the most vague and liberally used scientific terms. Using elementary cellular automata (ECA), and exploiting the CA classification, we demonstrate the elusiveness of 'complexity' by shifting the space-time dynamics of the automata from simple to complex through enriching cells with memory. In this way, we can transform any ECA class into another ECA class, without changing the skeleton of the cell-state transition function, and vice versa, just by selecting the right kind of memory. A systematic analysis shows that memory helps 'discover' hidden information and behaviour in both trivial (uniform, periodic) and non-trivial (chaotic, complex) dynamical systems.
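The memory mechanism can be sketched as follows. The specific choice below, each cell feeding the rule its majority state over the last tau time steps, with ties resolved to the most recent state, is one illustrative memory function among several used in this literature, not necessarily the paper's exact construction:

```python
def eca_step(row, rule):
    """One synchronous step of an elementary cellular automaton, Wolfram rule
    numbering, periodic boundary conditions."""
    n = len(row)
    return [(rule >> (row[(i - 1) % n] * 4 + row[i] * 2 + row[(i + 1) % n])) & 1
            for i in range(n)]

def run_with_memory(row, rule, steps, tau=1):
    """Run an ECA whose transition function reads, for each cell, the majority
    of that cell's last tau states instead of its current state alone.
    tau=1 recovers the plain ECA. Returns the full history of actual states."""
    history = [list(row)]
    n = len(row)
    for _ in range(steps):
        window = history[-tau:]           # at most tau most recent rows
        mem = []
        for i in range(n):
            ones = sum(h[i] for h in window)
            zeros = len(window) - ones
            # majority state; ties fall back to the most recent actual state
            mem.append(1 if ones > zeros else 0 if zeros > ones else history[-1][i])
        history.append(eca_step(mem, rule))
    return history
```

The skeleton of the transition function (`eca_step`) never changes; only the memory summary fed into it does, which is the sense in which memory alone can move an automaton between behaviour classes.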

Suggested by Samir
Workshop on Robustness, Adaptability and Critical Transitions in Living Systems. Submit your Abstract at http://seis.bristol.ac.uk/~fs13378/eccs_2014_livingsys.html Follow updates @SamirSuweis
The methodologies advocated in computational biology are in many cases proper system-level approaches. These methodologies are variously connected to the notion of a "mesosystem" and thus to a focus on the relational structures that are at the basis of biological regulation. Here, I describe how the formalization of biological systems by means of graph theory constitutes an extremely fruitful approach to biology. I suggest that the epistemological relevance of the notion of a graph resides in its multilevel character, which allows for a natural "middle-out" causation that makes largely obsolete the traditional opposition between "top-down" and "bottom-up" styles of reasoning, thereby fulfilling the founding dream of systems science: a direct link between systems analysis and the underlying physical reality.
The objective of CASSTING is to develop a novel approach for analysing and designing collective adaptive systems in their totality by setting up a game-theoretic framework. Here components are viewed as players, their behaviour is captured by strategies, system runs are plays, and specifications are winning conditions. We will develop formalisms for modelling collective adaptive systems as games, and algorithms for synthesising optimal strategies (and components).
3rd International Conference on Complex Dynamical Systems and Their Applications: New Mathematical Concepts and... http://t.co/Wd83nLP9Ik
We present a model that explores the influence of persuasion in a population of agents with positive and negative opinion orientations. The opinion of each agent is represented by an integer number k that expresses its level of agreement on a given issue, from totally against (k = -M) to totally in favor (k = M). Same-orientation agents persuade each other with probability p, becoming more extreme, while opposite-orientation agents become more moderate as they reach a compromise with probability q. The population initially evolves to (a) a polarized state for r = p/q > 1, where the opinion distribution is peaked at the extreme values k = ±M, or (b) a centralized state for r < 1, with most opinions around k = ±1. When r ≫ 1, polarization lasts for a time that diverges as r^M ln N, where N is the population's size. Finally, an extremist consensus (k = M or -M) is reached in a time that scales as r^-1 for r ≪ 1.
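A Monte Carlo sketch of the persuasion-compromise dynamics just described. The pairwise update scheme, the exclusion of k = 0, and the random sign flip when a moderating agent crosses the centre are assumptions made for illustration; the paper's exact microscopic rules may differ:

```python
import random

def simulate_opinions(N=500, M=3, p=0.5, q=0.1, steps=100000, seed=7):
    """Opinions are integers in {-M..-1, 1..M}. Each event picks a random pair:
    same-orientation pairs both move one step toward the extremes with
    probability p; opposite-orientation pairs both move one step toward the
    centre with probability q, flipping sign at random if they cross zero."""
    random.seed(seed)
    k = [random.choice([-1, 1]) * random.randint(1, M) for _ in range(N)]

    def extremize(i):
        if abs(k[i]) < M:
            k[i] += 1 if k[i] > 0 else -1

    def moderate(i):
        k[i] -= 1 if k[i] > 0 else -1
        if k[i] == 0:                      # hop over the excluded zero state
            k[i] = 1 if random.random() < 0.5 else -1

    for _ in range(steps):
        i, j = random.randrange(N), random.randrange(N)
        if i == j:
            continue
        if k[i] * k[j] > 0:                # same orientation: persuasion
            if random.random() < p:
                extremize(i); extremize(j)
        else:                              # opposite orientation: compromise
            if random.random() < q:
                moderate(i); moderate(j)
    return k
```

With r = p/q = 5 the population polarizes: after a few hundred interactions per agent, most opinions sit at or near the extremes ±M.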
This course of 25 lectures, filmed at Cornell University in Spring 2014, is intended for newcomers to nonlinear dynamics and chaos. It closely follows Prof. Strogatz's book, "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering." The mathematical treatment is friendly and informal, but still careful. Analytical methods, concrete examples, and geometric intuition are stressed. The theory is developed systematically, starting with first-order differential equations and their bifurcations, followed by phase plane analysis, limit cycles and their bifurcations, and culminating with the Lorenz equations, chaos, iterated maps, period doubling, renormalization, fractals, and strange attractors. A unique feature of the course is its emphasis on applications. These include airplane wing vibrations, biological rhythms, insect outbreaks, chemical oscillators, chaotic waterwheels, and even a technique for using chaos to send secret messages. In each case, the scientific background is explained at an elementary level and closely integrated with the mathematical theory. The theoretical work is enlivened by frequent use of computer graphics, simulations, and videotaped demonstrations of nonlinear phenomena. The essential prerequisite is single-variable calculus, including curve sketching, Taylor series, and separable differential equations. In a few places, multivariable calculus (partial derivatives, Jacobian matrix, divergence theorem) and linear algebra (eigenvalues and eigenvectors) are used. Fourier analysis is not assumed, and is developed where needed. Introductory physics is used throughout. Other scientific prerequisites would depend on the applications considered, but in all cases, a first course should be adequate preparation. Nonlinear Dynamics and Chaos, Steven Strogatz, Cornell University: https://www.youtube.com/playlist?list=PLbN57C5Zdl6j_qJApARJnKsmROzPnO9V
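As a small taste of the course's culminating topic, here is a sketch that integrates the Lorenz equations with a hand-rolled fourth-order Runge-Kutta step and exhibits sensitive dependence on initial conditions. The step size and initial conditions are arbitrary illustrative choices:

```python
def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz equations at the classic chaotic parameters."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(f, s, dt):
    """One classical RK4 step for an autonomous system given as a tuple."""
    k1 = f(s)
    k2 = f(tuple(si + dt / 2 * ki for si, ki in zip(s, k1)))
    k3 = f(tuple(si + dt / 2 * ki for si, ki in zip(s, k2)))
    k4 = f(tuple(si + dt * ki for si, ki in zip(s, k3)))
    return tuple(si + dt / 6 * (a + 2 * b + 2 * c + d)
                 for si, a, b, c, d in zip(s, k1, k2, k3, k4))

def trajectory(s0, steps=3000, dt=0.01):
    """Integrate forward from s0, returning the whole orbit."""
    traj = [s0]
    for _ in range(steps):
        traj.append(rk4_step(lorenz, traj[-1], dt))
    return traj
```

Two trajectories started 1e-8 apart remain on the bounded attractor yet end up macroscopically separated, which is the hallmark of chaos the lectures build toward.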
Via Complexity Digest
The path introduced here connects its topics through a common concept: an integral information measure and symmetry. The path starts from the axiomatic probability distributions of a stochastic multidimensional process, transferring a priori to a posteriori probabilities along the process trajectory. The entropy of the emerging Bayesian probabilities defines the process's uncertainty measure. Probability transitions model an interactive random process generated by idealized virtual measurements of uncertainty, as an observable process of a potential observer. When the measurements test uncertainty by interactive impulses, the inferred certain a posteriori probability starts converting uncertainty into the certainty of information. An observable uncertain impulse becomes a certain control, extracting maximum information from each observed minimum and initiating an information observer with an internal process during the conversion. Multiple trial actions produce an observed frequency of the events whose measured probability actually occurred. A dual minimax principle of maximin extraction and minimax consumption of information is the mathematical law whose variation equations determine the observer's structure and functionally unify its regularities. Impulse controls cut off the minimax, converting the external process into internal information micro- and macrodynamics through integral measuring, multiple trials, verification of symmetry, cooperation, enfoldment in a logical hierarchical information network (IN), and a feedback path to observations; the IN's high-level logic originates the observer's intelligence, which requests new quality information. These functional regularities create an integral logic that self-operates the observations, inner dynamical and geometrical structures with a boundary shaped by the IN's information geometry in time-space cooperative processes, physical substances, and the observer's cognition and intelligence. The logic holds the invariance of the information and physical regularities of the minimax law.
Two fundamental issues surrounding research on Zipf's law for city sizes are whether and why this law holds. This paper does not deal with the latter issue of why, and instead investigates whether Zipf's law holds in a global setting, involving all cities around the world. Unlike previous studies, which have mainly relied on conventional census data and census-bureau-imposed definitions of cities, we adopt naturally and objectively delineated cities, natural cities to be precise, in order to examine Zipf's law. We find that Zipf's law holds remarkably well for all natural cities at the global level, and remains almost valid at the continental level, except for Africa at certain time instants. We further examine the law at the country level, and note that Zipf's law is violated from country to country or from time to time. This violation is mainly due to limited scope: restricting attention to individual countries, or taking a static view of city-size distributions. The central argument of this paper is that Zipf's law is universal, and we therefore must use the correct scope in order to observe it. We further find that Zipf's law applies to city numbers: the number of cities in individual countries follows an inverse power relationship; the number of cities in the largest country is twice that in the second-largest country, three times that in the third-largest country, and so on.
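The rank-size regularity can be checked with a few lines of code. The ordinary least-squares fit on log size versus log rank below is a simple illustrative estimator (Zipf's law predicts a slope near -1), not the estimation procedure used in the paper:

```python
import math

def zipf_exponent(sizes):
    """OLS slope of log(size) against log(rank), ranks assigned by sorting
    sizes in descending order. A pure-Zipf sample gives a slope of -1."""
    ranked = sorted(sizes, reverse=True)
    xs = [math.log(r) for r in range(1, len(ranked) + 1)]
    ys = [math.log(s) for s in ranked]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

For example, a synthetic set of sizes proportional to 1/rank recovers a slope of exactly -1, while real city-size data would scatter around that value.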

When it comes to cities, being big and rich is better for the planet than being big and poor, according to a new study of carbon dioxide emissions from cities around the world. But is this correct?
Via Claudia Mihai, Roger D. Jones, PhD
Civil unrest is a powerful form of collective human dynamics, which has led to major transitions of societies in modern history. The study of collective human dynamics, including collective aggression, has been the focus of much discussion in the context of modeling and identification of universal patterns of behavior. In contrast, the possibility that civil unrest activities, across countries and over long time periods, are governed by universal mechanisms has not been explored. Here, records of civil unrest of 170 countries during the period 1919–2008 are analyzed. It is demonstrated that the distributions of the number of unrest events per year are robustly reproduced by a nonlinear, spatially extended dynamical model, which reflects the spread of civil disorder between geographic regions connected through social and communication networks. The results also expose the similarity between global social instability and the dynamics of natural hazards and epidemics.
In 2011, the wrath of the 99% kindled Occupy movements around the world. The protests petered out, but in their wake an international conversation about inequality has arisen, with tens of thousands of speeches, articles, and blogs engaging everyone from President Barack Obama on down. Ideology and emotion drive much of the debate. But increasingly, the discussion is sustained by a tide of new data on the gulf between rich and poor. This special issue uses these fresh waves of data to explore the origins, impact, and future of inequality around the world. Archaeological and ethnographic data are revealing how inequality got its start in our ancestors (see pp. 822 and 824). New surveys of emerging economies offer more reliable estimates of people's incomes and how they change as countries develop (see p. 832). And in the past decade in developed capitalist nations, intensive effort and interdisciplinary collaborations have produced large data sets, including the compilation of a century of income data and two centuries of wealth data into the World Top Incomes Database (WTID) (see p. 826 and Piketty and Saez, p. 838). Science, 23 May 2014, Vol. 344, no. 6186, pp. 818-821. DOI: 10.1126/science.344.6186.818
Via NESS
I was intrigued when Carl Woese told me his collaboration with University of Illinois physicist Nigel Goldenfeld was the most productive one of his entire career, and was pleased to finally run into Goldenfeld last September at lunch in the courtyard...
Mark Newman, May 2, 2014, Annual Science Board Symposium and Meeting, "Complexity: Theory and Practice"
Textbook for a seminar/course on complex systems. View full text in PDF format. The study of complex systems in a unified framework has become recognized in recent years as a new scientific discipline, the ultimate of interdisciplinary fields. Breaking down the barriers between physics, chemistry, and biology, and the so-called soft sciences of psychology, sociology, economics, and anthropology, this text explores the universal physical and mathematical principles that govern the emergence of complex systems from simple components.
Building on Complex Adaptive Systems theory and the basic Agent Based Modeling knowledge presented in SPM4530, the advanced course will focus on the model development process. The students are expected to conceptualize, develop and verify a model during the course, individually or in a group. The modeling tasks will be, as much as possible, based on real-life research problems formulated by various research groups from within and outside the faculty. Study goals: the main goal of the course is to learn how to form a modeling question, perform a system decomposition, conceptualize and formalize the system elements, implement and verify the simulation, and validate an Agent Based Model of a socio-technical system.
Complejidad y Economía's insight: full online text (Dynamics of Complex Systems, NECSI) http://t.co/sVePaP2sG2
The Wikimedia Foundation has recently observed that newly joining editors on Wikipedia are increasingly failing to integrate into the Wikipedia editors' community, i.e. the community is becoming increasingly harder to penetrate. To sustain healthy growth of the community, the Wikimedia Foundation aims to quantitatively understand the factors that determine editing behavior, and to explain why most new editors become inactive soon after joining. As a step towards this broader goal, the Wikimedia Foundation sponsored the ICDM (IEEE International Conference on Data Mining) contest for the year 2011. The objective for the participants was to develop models that predict the number of edits an editor will make in the next five months, based on the editor's editing history. Here we describe the approach we followed for developing predictive models towards this goal, the results that we obtained, and the modeling insights that we gained from this exercise. In addition, towards the broader goal of the Wikimedia Foundation, we also summarize the factors that emerged during our model-building exercise as powerful predictors of future editing activity.
Diffusion of innovation can be interpreted as a social spreading phenomenon governed by the impact of media and social interactions. Although these mechanisms have been identified by quantitative theories, their role and relative importance are not entirely understood, since empirical verification has so far been hindered by the lack of appropriate data. Here we analyse a dataset recording the spreading dynamics of the world's largest Voice over Internet Protocol service to empirically support the assumptions behind models of social contagion. We show that the probability of spontaneous service adoption is constant, the probability of adoption via social influence is linearly proportional to the fraction of adopting neighbours, and the probability of service termination is time-invariant and independent of the behaviour of peers. By implementing the detected diffusion mechanisms in a dynamical agent-based model, we are able to emulate the adoption dynamics of the service in several countries worldwide. This approach enables us to make medium-term predictions of service adoption and to disclose dependencies between the dynamics of innovation spreading and the socio-economic development of a country.
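The three detected mechanisms translate directly into an agent-based sketch: a constant spontaneous adoption probability, a peer-adoption probability linear in the fraction of adopting neighbours, and a peer-independent termination probability. The rates, the ring network, and the synchronous updating below are illustrative assumptions, not the paper's fitted values:

```python
import random

def ring_network(n, k=4):
    """Ring lattice in which each node links to its k nearest neighbours."""
    return [[(i + d) % n for d in range(-k // 2, k // 2 + 1) if d != 0]
            for i in range(n)]

def simulate_adoption(adj, p_spont=0.002, p_peer=0.05, p_stop=0.001,
                      steps=300, seed=3):
    """Synchronous-update contagion with the three mechanisms above.
    Returns the adopter count after each step."""
    random.seed(seed)
    n = len(adj)
    adopted = [False] * n
    counts = []
    for _ in range(steps):
        nxt = adopted[:]
        for i in range(n):
            if adopted[i]:
                # termination: time-invariant, independent of peers
                if random.random() < p_stop:
                    nxt[i] = False
            else:
                # adoption: constant spontaneous rate plus a term linear in
                # the fraction of adopting neighbours
                frac = (sum(adopted[j] for j in adj[i]) / len(adj[i])
                        if adj[i] else 0.0)
                if random.random() < p_spont + p_peer * frac:
                    nxt[i] = True
        adopted = nxt
        counts.append(sum(adopted))
    return counts
```

Run on a 500-node ring, the adopter count climbs from essentially zero towards a substantial fraction of the population, the qualitative S-shaped uptake such models are meant to emulate.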
We discuss models and data of crowd disasters, crime, terrorism, war and disease spreading to show that conventional recipes, such as deterrence strategies, are often neither effective nor sufficient to contain them. Many common approaches do not provide a good picture of the actual system behavior, because they neglect feedback loops, instabilities and cascade effects. The complex and often counterintuitive behavior of social systems and their macro-level collective dynamics can be better understood by means of complexity science. We highlight that suitable system design and management can help to stop undesirable cascade effects and to enable favorable kinds of self-organization in the system. In this way, complexity science can help to save human lives.
Power grids, road maps, and river streams are examples of infrastructural networks which are highly vulnerable to external perturbations. An abrupt local change of load (voltage, traffic density, or water level) might propagate in a cascading way and affect a significant fraction of the network. Almost discontinuous perturbations can be modeled by shock waves which can eventually interfere constructively and endanger the normal functionality of the infrastructure. We study their dynamics by solving the Burgers equation under random perturbations on several real and artificial directed graphs. Even for graphs with a narrow distribution of node properties (e.g., degree or betweenness), a steady state is reached exhibiting a heterogeneous load distribution, with a difference of one order of magnitude between the highest and average loads. Unexpectedly, we find for the European power grid and for finite Watts-Strogatz networks a pronounced broad bimodal distribution of the loads. To identify the most vulnerable nodes, we introduce the concept of node-basin size, a purely topological property which we show to be strongly correlated with the average load of a node.
Via Shaolin Tan, NESS
