Building on the Complex Adaptive Systems theory and basic Agent-Based Modeling knowledge presented in SPM4530, the Advanced course will focus on the model development process. Students are expected to conceptualize, develop and verify a model during the course, individually or in a group. The modeling tasks will be, as much as possible, based on real-life research problems formulated by various research groups from within and outside the faculty. Study goals: the main goal of the course is to learn how to formulate a modeling question, perform a system decomposition, conceptualize and formalize the system elements, implement and verify the simulation, and validate an Agent-Based Model of a sociotechnical system.
Complejidad y Economía's insight: Full Online Text (Dynamics of Complex Systems, NECSI) http://t.co/sVePaP2sG2
The Wikimedia Foundation has recently observed that newly joining editors on Wikipedia are increasingly failing to integrate into the Wikipedia editors' community, i.e. the community is becoming harder to penetrate. To sustain healthy growth of the community, the Wikimedia Foundation aims to quantitatively understand the factors that determine editing behavior, and to explain why most new editors become inactive soon after joining. As a step towards this broader goal, the Wikimedia Foundation sponsored the ICDM (IEEE International Conference on Data Mining) contest for the year 2011. The objective for the participants was to develop models to predict the number of edits that an editor will make in the next five months based on the editor's editing history. Here we describe the approach we followed for developing predictive models towards this goal, the results that we obtained, and the modeling insights that we gained from this exercise. In addition, towards the broader goal of the Wikimedia Foundation, we also summarize the factors that emerged during our model-building exercise as powerful predictors of future editing activity.
Diffusion of innovation can be interpreted as a social spreading phenomenon governed by the impact of media and social interactions. Although these mechanisms have been identified by quantitative theories, their role and relative importance are not entirely understood, since empirical verification has so far been hindered by the lack of appropriate data. Here we analyse a dataset recording the spreading dynamics of the world's largest Voice over Internet Protocol service to empirically support the assumptions behind models of social contagion. We show that the probability of spontaneous service adoption is constant, the probability of adoption via social influence is linearly proportional to the fraction of adopting neighbours, and the probability of service termination is time-invariant and independent of the behaviour of peers. By implementing the detected diffusion mechanisms into a dynamical agent-based model, we are able to emulate the adoption dynamics of the service in several countries worldwide. This approach enables us to make medium-term predictions of service adoption and disclose dependencies between the dynamics of innovation spreading and the socioeconomic development of a country.
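The three detected mechanisms (constant spontaneous adoption, peer adoption linear in the fraction of adopting neighbours, peer-independent termination) are simple enough to sketch as a toy agent-based model. Everything here is an assumption for illustration: the random k-neighbour contact structure, the parameter values, and the function name are not from the paper.

```python
import random

def simulate_adoption(n=500, k=6, p_spont=0.002, p_peer=0.05,
                      p_stop=0.001, steps=200, seed=1):
    rng = random.Random(seed)
    # hypothetical contact structure: k random neighbours per agent
    neigh = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    adopted = [False] * n
    curve = []
    for _ in range(steps):
        nxt = adopted[:]
        for i in range(n):
            if not adopted[i]:
                frac = sum(adopted[j] for j in neigh[i]) / k
                # constant spontaneous rate + rate linear in adopting neighbours
                if rng.random() < p_spont + p_peer * frac:
                    nxt[i] = True
            elif rng.random() < p_stop:
                nxt[i] = False            # termination independent of peers
        adopted = nxt
        curve.append(sum(adopted))
    return curve

curve = simulate_adoption()   # number of adopters per time step
```

Plotting `curve` gives the familiar S-shaped adoption trajectory once the peer term starts to dominate the spontaneous one.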
We discuss models and data of crowd disasters, crime, terrorism, war and disease spreading to show that conventional recipes, such as deterrence strategies, are often neither effective nor sufficient to contain them. Many common approaches do not provide a good picture of the actual system behavior, because they neglect feedback loops, instabilities and cascade effects. The complex and often counterintuitive behavior of social systems and their macro-level collective dynamics can be better understood by means of complexity science. We highlight that suitable system design and management can help to stop undesirable cascade effects and to enable favorable kinds of self-organization in the system. In this way, complexity science can help to save human lives.
Power grids, road maps, and river streams are examples of infrastructural networks which are highly vulnerable to external perturbations. An abrupt local change of load (voltage, traffic density, or water level) might propagate in a cascading way and affect a significant fraction of the network. Almost discontinuous perturbations can be modeled by shock waves which can eventually interfere constructively and endanger the normal functionality of the infrastructure. We study their dynamics by solving the Burgers equation under random perturbations on several real and artificial directed graphs. Even for graphs with a narrow distribution of node properties (e.g., degree or betweenness), a steady state is reached exhibiting a heterogeneous load distribution, having a difference of one order of magnitude between the highest and average loads. Unexpectedly we find for the European power grid and for finite Watts-Strogatz networks a broad pronounced bimodal distribution for the loads. To identify the most vulnerable nodes, we introduce the concept of node-basin size, a purely topological property which we show to be strongly correlated to the average load of a node.
Via Shaolin Tan, NESS
In this paper, we introduce a new approach to modeling human collective behavior in the specific scenario of a sudden catastrophe, whether natural (e.g., earthquake, tsunami) or technological (e.g., a nuclear event). The novelty of our work is to propose a mathematical model that takes into account different concurrent behaviors in such a situation and includes the processes of transition from one behavior to another during the event. In this multidisciplinary research, involving mathematicians, computer scientists and geographers, we take into account the psychological reactions of the population in disaster situations and study their mode of propagation. We propose an SIR-based model in which three types of collective reaction occur in catastrophe situations: reflex, panic and controlled behaviors. Moreover, we suppose that the interactions among these classes of population are realized through imitation and emotional contagion processes. Simulations attest to the relevance of the proposed model.
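As a rough illustration of the compartmental idea, here is a minimal three-compartment sketch (reflex, panic, controlled) with imitation-style bilinear transition terms, integrated by explicit Euler. The rate constants and the exact form of the terms are assumptions for illustration, not the authors' equations.

```python
def simulate_behaviours(steps=1000, dt=0.01):
    # reflex r, panic p, controlled c: population fractions summing to 1
    r, p, c = 1.0, 0.0, 0.0
    a_rp, a_rc, a_pc = 0.5, 1.0, 0.3   # hypothetical transition rates
    hist = []
    for _ in range(steps):
        # bilinear "contagion" terms plus a small 0.1 baseline so that
        # transitions can start even from an empty target compartment
        dr = -a_rp * r * (p + 0.1) - a_rc * r * (c + 0.1)
        dp = a_rp * r * (p + 0.1) - a_pc * p * (c + 0.1)
        dc = a_rc * r * (c + 0.1) + a_pc * p * (c + 0.1)
        r, p, c = r + dt * dr, p + dt * dp, c + dt * dc   # explicit Euler
        hist.append((r, p, c))
    return hist

hist = simulate_behaviours()
```

Because the three derivatives sum to zero by construction, the total population fraction is conserved along the trajectory.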
To appear in Theory and Practice of Logic Programming (TPLP). Dynamic systems play a central role in fields such as planning, verification, and databases. Fragmented throughout these fields, we find a multitude of languages to formally specify dynamic systems and a multitude of systems to reason on such specifications. Often, such systems are bound to one specific language and one specific inference task. It is troublesome that performing several inference tasks on the same knowledge requires translations of your specification to other languages. In this paper we study whether it is possible to perform a broad set of wellstudied inference tasks on one specification. More concretely, we extend IDP3 with several inferences from fields concerned with dynamic specifications.
An important class of economic models involves agents whose wealth changes due to transactions with other agents. Several authors have pointed out an analogy with kinetic theory, which describes molecules whose momentum and energy change due to interactions with other molecules. We pursue this analogy and derive a Boltzmann equation for the time evolution of the wealth distribution of a population of agents for the so-called Yard-Sale Model of wealth exchange. We examine the solutions to this equation by a combination of analytical and numerical methods and investigate its long-time limit. We study an important limit of this equation for small transaction sizes and derive a partial integro-differential equation governing the evolution of the wealth distribution in a closed economy. We then describe how this model can be extended to include features such as inflation, production, and taxation. In particular, we show that the model with taxation exhibits the basic features of the Pareto law, namely, a lower cutoff to the wealth density at small values of wealth.
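The Yard-Sale exchange rule itself is easy to state in code: two random agents stake a fraction of the poorer agent's wealth on a fair coin flip. The sketch below (parameter choices and the Gini helper are illustrative, and it omits the paper's taxation extension) shows how a Monte Carlo version concentrates wealth over time.

```python
import random

def yard_sale(n=200, beta=0.1, steps=200_000, seed=7):
    rng = random.Random(seed)
    w = [1.0] * n                 # equal initial wealth; total is conserved
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        stake = beta * min(w[i], w[j])   # stake a fraction of the POORER agent
        if rng.random() < 0.5:           # fair coin decides the winner
            w[i] += stake; w[j] -= stake
        else:
            w[i] -= stake; w[j] += stake
    return w

def gini(w):
    # standard Gini coefficient from the sorted wealth vector
    s = sorted(w); n = len(s); tot = sum(s)
    cum = sum((k + 1) * x for k, x in enumerate(s))
    return 2 * cum / (n * tot) - (n + 1) / n

wealth = yard_sale()
```

Without a redistributive (taxation) term, repeated fair exchanges drive the Gini coefficient toward 1: wealth concentrates even though every individual transaction is unbiased.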
This paper presents a new air traffic complexity metric based on dynamical systems. From a set of radar observations (position and speed), a vector field interpolating these data is constructed. Once the field has been obtained, the Lyapunov spectrum of the associated dynamical system is computed at points evenly spaced on a spatial grid. The results of the computations are summarized in complexity maps, with high values indicating areas to avoid or to monitor carefully. A first approach based on a linear dynamical system enables the computation of an aggregate complexity metric. In order to produce complexity maps, two extensions of the previous approach have been developed (one in space and another in space and time). Finally, an approximation is proposed in order to localize the computation of the vector field by means of Local Linear Models.
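For the linear-system step, the Lyapunov exponents of a field v(x) = Ax are simply the real parts of the eigenvalues of A. A minimal 2x2 sketch, assuming a matrix already fitted from the radar data (only this step of the paper is illustrated):

```python
import math

def lyapunov_exponents_2x2(a, b, c, d):
    # eigenvalues of [[a, b], [c, d]] via trace and determinant;
    # the Lyapunov exponents of a linear field are their real parts
    tr, det = a + d, a * d - b * c
    disc = tr * tr / 4 - det
    if disc >= 0:                       # real eigenvalues
        root = math.sqrt(disc)
        return (tr / 2 + root, tr / 2 - root)
    return (tr / 2, tr / 2)             # complex pair: shared real part

exps = lyapunov_exponents_2x2(0.0, 1.0, -1.0, 0.0)  # pure rotation
```

A pure rotation field gives exponents (0, 0), i.e. no divergence of trajectories, whereas a saddle-like fit would flag high complexity through its positive exponent.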
This book is also about complexity science, which is an interdisciplinary field—at the intersection of mathematics, computer science and natural science—that focuses on discrete models of physical systems. In particular, it focuses on complex systems, which are systems with many interacting components. Complex systems include networks and graphs, cellular automata, agent-based models and swarms, fractals and self-organizing systems, chaotic systems and cybernetic systems.
The main aim of the 2014 Interdisciplinary Symposium on Complex Systems is to bring together researchers working on complex systems.
Individuals in groups, whether composed of humans or other animal species, often make important decisions collectively, including avoiding predators, selecting a direction in which to migrate and electing political leaders. Theoretical and empirical work suggests that collective decisions can be more accurate than individual decisions, a phenomenon known as the ‘wisdom of crowds’. In these previous studies, it has been assumed that individuals make independent estimates based on a single environmental cue. In the real world, however, most cues exhibit some spatial and temporal correlation, and consequently, the sensory information that near neighbours detect will also be, to some degree, correlated. Furthermore, it may be rare for an environment to contain only a single informative cue, with multiple cues being the norm. We demonstrate, using two simple models, that taking this natural complexity into account considerably alters the relationship between group size and decisionmaking accuracy. In only a minority of environments do we observe the typical wisdom of crowds phenomenon (whereby collective accuracy increases monotonically with group size). When the wisdom of crowds is not observed, we find that a finite, and often small, group size maximizes decision accuracy. We reveal that, counterintuitively, it is the noise inherent in these small groups that enhances their accuracy, allowing individuals in such groups to avoid the detrimental effects of correlated information while exploiting the benefits of collective decisionmaking. Our results demonstrate that the conventional view of the wisdom of crowds may not be informative in complex and realistic environments, and that being in small groups can maximize decision accuracy across many contexts.
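A minimal sketch of the correlated-cue setup: each group member's estimate mixes a shared environmental noise term with private noise, and the group decides by majority vote. The mixing weight rho and the Gaussian cue model are illustrative assumptions, not the paper's two models.

```python
import random

def group_accuracy(n, rho=0.5, trials=2000, seed=3):
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        shared = rng.gauss(0, 1)          # noise common to all group members
        votes = 0
        for _ in range(n):
            # each cue mixes the true signal (+1), shared noise, private noise
            cue = 1.0 + rho * shared + (1 - rho) * rng.gauss(0, 1)
            votes += 1 if cue > 0 else -1
        correct += votes > 0              # majority vote (odd n avoids ties)
    return correct / trials

acc_solo = group_accuracy(1)
acc_group = group_accuracy(25)
```

In this toy version, growing the group averages out only the private noise: accuracy saturates at the probability that the shared noise does not flip the cue, rather than approaching certainty as classic wisdom-of-crowds arguments would suggest.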

The objective of CASSTING is to develop a novel approach for analysing and designing collective adaptive systems in their totality, by setting up a game theoretic framework. Here components are viewed as players, their behaviour is captured by strategies, system runs are plays, and specifications are winning conditions. We will develop formalisms for modelling collective adaptive systems as games, and algorithms for synthesising optimal strategies (and components).
3rd International Conference on Complex Dynamical Systems and Their Applications: New Mathematical Concepts and... http://t.co/Wd83nLP9Ik
We present a model that explores the influence of persuasion in a population of agents with positive and negative opinion orientations. The opinion of each agent is represented by an integer number k that expresses its level of agreement on a given issue, from totally against (k = -M) to totally in favor (k = M). Same-orientation agents persuade each other with probability p, becoming more extreme, while opposite-orientation agents become more moderate as they reach a compromise with probability q. The population initially evolves to (a) a polarized state for r = p/q > 1, where the opinion distribution is peaked at the extreme values k = ±M, or (b) a centralized state for r < 1, with most opinions around k = ±1. When r ≫ 1, polarization lasts for a time that diverges as r^M ln N, where N is the population's size. Finally, an extremist consensus (k = M or k = -M) is reached in a time that scales as r^-1 for r ≪ 1.
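A minimal reading of the update rules can be simulated directly. The details below (both partners move one step per interaction, opinions at k = ±1 stay put under compromise, k = 0 excluded) are assumptions where the abstract is silent.

```python
import random

def persuasion_model(n=1000, M=3, p=0.4, q=0.1, steps=200_000, seed=5):
    rng = random.Random(seed)
    k = [rng.choice([-1, 1]) for _ in range(n)]   # moderate starting opinions
    def step_out(x):
        # one step toward the extremes, capped at |k| = M
        return min(x + 1, M) if x > 0 else max(x - 1, -M)
    def step_in(x):
        # one step toward the centre; k = ±1 stays put (no k = 0 state)
        if abs(x) == 1:
            return x
        return x - 1 if x > 0 else x + 1
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        if k[i] * k[j] > 0:
            if rng.random() < p:          # persuasion: both become more extreme
                k[i], k[j] = step_out(k[i]), step_out(k[j])
        elif rng.random() < q:            # compromise: both become more moderate
            k[i], k[j] = step_in(k[i]), step_in(k[j])
    return k

opinions = persuasion_model()   # r = p/q = 4 > 1: expect a polarized state
```

With r = 4 the outward drift dominates and most of the population ends up at the extreme values k = ±M, matching the polarized regime described in the abstract.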
This course of 25 lectures, filmed at Cornell University in Spring 2014, is intended for newcomers to nonlinear dynamics and chaos. It closely follows Prof. Strogatz's book, "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering." The mathematical treatment is friendly and informal, but still careful. Analytical methods, concrete examples, and geometric intuition are stressed. The theory is developed systematically, starting with first-order differential equations and their bifurcations, followed by phase plane analysis, limit cycles and their bifurcations, and culminating with the Lorenz equations, chaos, iterated maps, period doubling, renormalization, fractals, and strange attractors. A unique feature of the course is its emphasis on applications. These include airplane wing vibrations, biological rhythms, insect outbreaks, chemical oscillators, chaotic waterwheels, and even a technique for using chaos to send secret messages. In each case, the scientific background is explained at an elementary level and closely integrated with the mathematical theory. The theoretical work is enlivened by frequent use of computer graphics, simulations, and videotaped demonstrations of nonlinear phenomena. The essential prerequisite is single-variable calculus, including curve sketching, Taylor series, and separable differential equations. In a few places, multivariable calculus (partial derivatives, Jacobian matrix, divergence theorem) and linear algebra (eigenvalues and eigenvectors) are used. Fourier analysis is not assumed, and is developed where needed. Introductory physics is used throughout. Other scientific prerequisites would depend on the applications considered, but in all cases, a first course should be adequate preparation. Nonlinear Dynamics and Chaos - Steven Strogatz, Cornell University: https://www.youtube.com/playlist?list=PLbN57C5Zdl6j_qJApARJnKsmROzPnO9V
Via Complexity Digest
The introduced path connects the topics through a common concept: an integral information measure and symmetry. The initial path sequences axiomatic probability distributions of a stochastic multidimensional process, transferring each a priori probability to a posteriori probabilities alternating over the process trajectory. The entropy of the emerging Bayesian probabilities defines the process uncertainty measure. Probability transitions model an interactive random process generated by idealized virtual measurements of uncertainty, as an observable process of a potential observer. When the measurements test uncertainty by interactive impulses, the inferred certain a posteriori probability starts converting uncertainty to certainty, i.e. information. An observable uncertain impulse becomes a certain control extracting maximum information from each observed minimum and initiating an information observer with an internal process during the conversion. Multiple trial actions produce the observed frequency of the events whose measured probability actually occurred. The dual minimax principle of maxmin information extraction and minimax consumption is a mathematical law whose variation equations determine the observer's structure and functionally unify its regularities. Impulse controls cut off the minimax, converting the external process to internal information micro- and macrodynamics through integral measuring, multiple trials, verification of symmetry, cooperation, enfoldment in a logical hierarchical information network (IN), and a feedback path to observations; the IN's high-level logic originates the observer's intelligence, requesting new quality information. These functional regularities create an integral logic self-operating the observations, inner dynamical and geometrical structures with a boundary shaped by the IN's information geometry in time-space cooperative processes, and physical substances, observer cognition, and intelligence. The logic holds the invariance of the information and physical regularities of the minimax law.
Two fundamental issues surrounding research on Zipf's law regarding city sizes are whether and why this law holds. This paper does not deal with the latter issue of why, and instead investigates whether Zipf's law holds in a global setting, involving all cities around the world. Unlike previous studies, which have mainly relied on conventional census data and census-bureau-imposed definitions of cities, we adopt naturally and objectively delineated cities, or natural cities, to examine Zipf's law. We find that Zipf's law holds remarkably well for all natural cities at the global level, and remains almost valid at the continental level, except for Africa at certain time instants. We further examine the law at the country level, and note that Zipf's law is violated from country to country or from time to time. This violation is mainly due to the limitations of such views: being restricted to individual countries, or to a static view of city-size distributions. The central argument of this paper is that Zipf's law is universal, and we therefore must use the correct scope in order to observe it. We further find that Zipf's law applies to city numbers as well: the number of cities in individual countries follows an inverse power relationship; the number of cities in the largest country is twice that in the second largest country, three times that in the third largest country, and so on.
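The rank-size form of Zipf's law (the size of the rank-r city proportional to 1/r, i.e. a log-log slope near -1) can be checked with a simple regression. The data below are synthetic ideal-Zipf values, not the natural-cities data set.

```python
import math

def zipf_exponent(sizes):
    # least-squares slope of log(size) vs log(rank);
    # Zipf's law corresponds to a slope near -1
    s = sorted(sizes, reverse=True)
    xs = [math.log(r + 1) for r in range(len(s))]   # log rank, 1-based
    ys = [math.log(v) for v in s]
    n = len(s)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# synthetic ideal Zipf data: size of the rank-r city = largest / r
ideal = [1_000_000 / r for r in range(1, 101)]
slope = zipf_exponent(ideal)
```

The same function applied to city counts per country would test the inverse power relationship the paper reports for city numbers.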
Though biological and artificial complex systems with inhibitory connections exhibit a high degree of clustering in their interaction pattern, the evolutionary origin of clustering in such systems remains a challenging problem. Using a genetic algorithm, we demonstrate that inhibition is required in the evolution of clique structure from a primary random architecture, in which the fitness function is assigned based on the largest eigenvalue. Further, the distribution of triangles over the nodes of the system evolved from mixed connections shows a negative correlation with degree, providing insight into the origin of this trend observed in realistic interaction patterns.
In this paper a new evolutionary algorithm for continuous nonlinear optimization problems is presented. The method is inspired by the lifestyle of a bird, the cuckoo. The Cuckoo Optimization Algorithm (COA) is evaluated using the Rastrigin function, a nonlinear continuous function commonly used for evaluating optimization algorithms. The efficiency of the COA is studied by obtaining optimal solutions of the Rastrigin function in various dimensions. The function was also solved by the FA and ABC algorithms. Comparing the results shows that the COA has better performance than the other algorithms. Application of the algorithm to the test function has proven its capability to deal with difficult optimization problems.
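For reference, the Rastrigin benchmark mentioned above is simple to write down; a crude random-search baseline (not the COA) is included as the yardstick any metaheuristic should beat on this function.

```python
import math, random

def rastrigin(x):
    # Rastrigin benchmark: f(x) = 10 d + sum(x_i^2 - 10 cos(2 pi x_i));
    # global minimum f = 0 at the origin, many regularly spaced local minima
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

def random_search(dim=2, evals=20_000, bound=5.12, seed=2):
    # naive baseline: sample uniformly in [-bound, bound]^dim, keep the best
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(evals):
        x = [rng.uniform(-bound, bound) for _ in range(dim)]
        best = min(best, rastrigin(x))
    return best

best = random_search()
```

The dense field of local minima is what makes Rastrigin hard for gradient-based methods and a standard testbed for population-based metaheuristics like COA, FA and ABC.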
Many complex systems can be represented as networks composed of distinct layers, interacting with and depending on each other. For example, in biology, a good description of the full protein-protein interactome requires, for some organisms, up to seven distinct network layers, with thousands of protein-protein interactions each. A fundamental open question is then how much information is really necessary to accurately represent the structure of a multilayer complex system, and if and when some of the layers can indeed be aggregated. Here we introduce a method, based on information theory, to reduce the number of layers in multilayer networks while minimizing information loss. We validate our approach on a set of synthetic benchmarks and prove its applicability to an extended data set of protein-genetic interactions, showing cases where a strong reduction is possible and cases where it is not. Using this method, we can describe complex systems with an optimal trade-off between accuracy and complexity.
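As a toy illustration of layer aggregation (not the paper's information-theoretic criterion), the sketch below greedily merges layers whose edge sets have high Jaccard similarity, a crude stand-in for "little information is lost by merging".

```python
def aggregate_layers(layers, threshold=0.5):
    # greedily merge layers whose edge sets overlap strongly;
    # Jaccard similarity is an illustrative proxy, not the paper's measure
    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 1.0
    merged = [set(layer) for layer in layers]
    i = 0
    while i < len(merged):
        j = i + 1
        while j < len(merged):
            if jaccard(merged[i], merged[j]) >= threshold:
                merged[i] |= merged.pop(j)   # similar layers: merge them
            else:
                j += 1
        i += 1
    return merged

# three hypothetical layers as edge sets: the first two largely overlap
layers = [{(1, 2), (2, 3)}, {(1, 2), (2, 3), (3, 4)}, {(5, 6)}]
reduced = aggregate_layers(layers)
```

Here the two overlapping layers collapse into one while the disjoint layer survives, mirroring the paper's finding that some multilayer systems compress well and others do not.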
World Conference on Complex Systems (WCCS) will provide a highlevel, international forum for scientists, researchers, industrial professionals,...
Realworld entities often interconnect with each other through explicit or implicit relationships to form a complex network.
The fourth ICCSA will focus on recent advances in complex systems and applications in all fields of science and engineering.
