Cooperative game theory deals with situations in which the participants' objectives are partially cooperative and partially conflicting. It is in the participants' interest to cooperate, in the sense of making binding agreements to achieve the maximum possible benefit; when it comes to the distribution of that benefit, however, their interests conflict. Such situations are usually modelled as cooperative games. While the book mainly discusses transferable utility games, there is also a brief analysis of non-transferable utility games. Alternative solution concepts to cooperative game theoretic problems are presented in chapters 1-9, and the next four chapters present issues related to computing the solutions discussed in the earlier chapters. The proofs of all results presented in the book are quite explicit, and the mathematical techniques employed in demonstrating the results will be helpful to those who wish to learn how mathematics is applied to problems in game theory.
Via Complexity Digest
How do shared conventions emerge in complex decentralized social systems? This question engages fields as diverse as linguistics, sociology, and cognitive science. Previous empirical attempts to solve this puzzle all presuppose that formal or informal institutions, such as incentives for global agreement, coordinated leadership, or aggregated information about the population, are needed to facilitate a solution. Evolutionary theories of social conventions, by contrast, hypothesize that such institutions are not necessary in order for social conventions to form. However, empirical tests of this hypothesis have been hindered by the difficulties of evaluating the real-time creation of new collective behaviors in large decentralized populations. Here, we present experimental results—replicated at several scales—that demonstrate the spontaneous creation of universally adopted social conventions and show how simple changes in a population’s network structure can direct the dynamics of norm formation, driving human populations with no ambition for large-scale coordination to rapidly evolve shared social conventions.
Via Ashish Umre
Contemporary complexity theory has been instrumental in providing novel rigorous definitions for some classic philosophical concepts, including emergence. In an attempt to provide an account of emergence that is consistent with complexity and dynamical systems theory, several authors have turned to the notion of constraints on state transitions. Drawing on complexity theory directly, this paper builds on those accounts, further developing the constraint-based interpretation of emergence and arguing that such accounts recover many of the features of more traditional accounts. We show that the constraint-based account of emergence also leads naturally into a meaningful definition of self-organization, another concept that has received increasing attention recently. Along the way, we distinguish between order and organization, two concepts which are frequently conflated. Finally, we consider possibilities for future research in the philosophy of complex systems, as well as applications of the distinctions made in this paper. Self-Organization, Emergence, and Constraint in Complex Natural Systems Jonathan Lawhead http://arxiv.org/abs/1502.01476
Via Complexity Digest
We analyze the replicator-mutator equations for the Rock-Paper-Scissors game. Various graph-theoretic patterns of mutation are considered, ranging from a single unidirectional mutation pathway between two of the species, to global bidirectional mutation among all the species. Our main result is that the coexistence state, in which all three species exist in equilibrium, can be destabilized by arbitrarily small mutation rates. After it loses stability, the coexistence state gives birth to a stable limit cycle solution created in a supercritical Hopf bifurcation. This attracting periodic solution exists for all the mutation patterns considered, and persists arbitrarily close to the limit of zero mutation rate and a zero-sum game. Nonlinear Dynamics of the Rock-Paper-Scissors Game with Mutations Danielle F. P. Toupo, Steven H. Strogatz http://arxiv.org/abs/1502.03370
Via Complexity Digest
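The replicator-mutator dynamics described above are easy to explore numerically. The snippet below is a minimal sketch, not the paper's exact model: it uses the standard zero-sum RPS payoff matrix, symmetric global mutation at an illustrative rate `mu`, and a plain Euler integrator. For this fully symmetric sketch the mixed state (1/3, 1/3, 1/3) is weakly attracting; the paper's bifurcation analysis treats more general mutation patterns and near-zero-sum games, where the limit cycle appears.

```python
import numpy as np

# Zero-sum Rock-Paper-Scissors payoffs: row strategy vs. column strategy.
A = np.array([[0.0, -1.0, 1.0],
              [1.0, 0.0, -1.0],
              [-1.0, 1.0, 0.0]])

def step(x, mu, dt=0.01):
    """One Euler step of replicator dynamics with symmetric pairwise mutation."""
    f = A @ x                     # fitness of each strategy
    phi = x @ f                   # population mean fitness
    dx = x * (f - phi)            # replicator term
    dx += mu * (x.sum() - 3 * x)  # mutation: sum over j != i of (x_j - x_i)
    return x + dt * dx

x = np.array([0.5, 0.3, 0.2])     # initial strategy frequencies
for _ in range(5000):             # integrate to t = 50
    x = step(x, mu=0.01)
print(x)                          # frequencies remain on the simplex
```

Because both the replicator term and the mutation term sum to zero, each Euler step preserves the simplex constraint sum(x) = 1 up to floating-point error.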
All systems in nature have one thing in common: they process information. Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred and modified. Indeed, bits of information about the state of one element will travel—imperfectly—to the state of the other element, forming its new state. This storage, transfer, and modification of information, possibly between levels of a multilevel system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions. Mapping out exactly how these bits of information percolate through the system could reveal new fundamental insights into how the parts orchestrate to produce the properties of the system. A theory of information processing would be capable of defining a set of universal properties of dynamical multilevel complex systems, which describe and compare the dynamics of diverse complex systems ranging from social interaction to brain networks, from financial markets to biomedicine. Each possible combination of rules of dynamics and topology of interactions, with disparate semantics, would reduce to a single language of information processing. Guest Editor: Dr. Rick Quax. Deadline for manuscript submissions: 28 February 2015.
Via Complexity Digest, Alejandro J. Alvarez S.
The first network analysis of the entire body of European Community legislation reveals the pattern of links between laws and their resilience to change.
Reconstructing Reality: Models, Mathematics, and Simulations (Oxford Studies in Philosophy of Science), by Margaret Morrison. Attempts to understand various aspects of the empirical world often rely on modelling processes that involve a reconstruction of systems under investigation. Typically the reconstruction uses mathematical frameworks like gauge theory and renormalization group methods.
An important question in the debate over embodied, enactive, and extended cognition has been what is meant by “cognition”. What is this cognition that is supposed to be embodied, enactive, or extended? Rather than undertake a frontal assault on this question, however, this paper will take a different approach. In particular, we may ask how cognition is supposed to be related to behavior. First, we could ask whether cognition is supposed to be (a type of) behavior. Second, we could ask whether we should attempt to understand cognitive processes in terms of antecedently understood cognitive behaviors. This paper will survey some of the answers that have been (implicitly or explicitly) given in the embodied, enactive, and extended cognition literature, then suggest reasons to believe that we should answer both questions in the negative. Cognition and behavior Ken Aizawa Synthese January 2015 http://dx.doi.org/10.1007/s1122901406455
Via Complexity Digest, NESS
The spatial distribution of people exhibits clustering across a wide range of scales, from household to continental scales. Empirical data indicate simple power-law scalings for the size distribution of cities (known as Zipf's law), the geographic distribution of friends, and the population density fluctuations as a function of scale. We derive a simple statistical model that explains all of these scaling laws based on a single unifying principle involving the random spatial growth of clusters of people on all scales. The model makes important new predictions for the spread of diseases and other social phenomena. A Unifying Theory for Scaling Laws of Human Populations Henry W. Lin, Abraham Loeb http://arxiv.org/abs/1501.00738
Via Complexity Digest, NESS
Understanding human socioeconomic development has proven to be one of the most difficult and persistent problems in science and policy. Traditional policy has often attempted to promote human development through infrastructure and the delivery of services, but the link between these engineered systems and the complexity of human socioeconomic behavior remains poorly understood. Recent research suggests that the key to socioeconomic progress lies in the development of processes whereby new information is created by individuals and organizations and embedded in the structure of social networks at a diverse set of scales, from nations to cities to firms. Here, we formalize these ideas in terms of network theory—namely, the spatial network of mobile phone communications in Côte d’Ivoire—to show how incipient socioeconomic connectivity may constitute a general obstacle to development. Inspired by recent progress in the theory of cities as complex systems, we then propose a set of tests for these theories using telecommunications network data and describe how telecommunication services may generally help promote socioeconomic development.
Here we study the emergence of spontaneous leadership in large populations. In standard models of opinion dynamics, herding behavior is only obeyed at the local scale due to the interaction of single agents with their neighbors, while at the global scale such models are governed by purely diffusive processes. Surprisingly, in this paper we show that the combination of a strong separation of time scales within the population and a hierarchical organization of the influences of some agents on the others induces a phase transition between a purely diffusive phase, as in the standard case, and a herding phase where a fraction of the agents self-organize and lead the global opinion of the whole population. Follow the Leader: Herding Behavior in Heterogeneous Populations Guillem Mosquera-Donate, Marian Boguna http://arxiv.org/abs/1412.7427
Via Complexity Digest
The Lyapunov exponent characterizes the exponential growth rate of the difference between nearby orbits. A positive Lyapunov exponent is a manifestation of chaos. Here, we propose the Lyapunov pair, which is based on the generalized Lyapunov exponent, as a unified characterization of non-exponential and exponential dynamical instabilities in one-dimensional maps. Chaos is classified into three different types, i.e., superexponential, exponential, and subexponential dynamical instabilities. Using one-dimensional maps, we demonstrate superexponential and subexponential chaos and quantify the dynamical instabilities by the Lyapunov pair. In subexponential chaos, we show superweak chaos, which means that the growth of the difference between nearby orbits is slower than a stretched exponential growth. The scaling of the growth is studied analytically by a recently developed theory of a continuous accumulation process, which is related to infinite ergodic theory.
Via Bernard Ryefield
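The ordinary (exponential) Lyapunov exponent that the generalized exponent refines can be estimated for a one-dimensional map by averaging ln|f'(x_n)| along an orbit. Below is a minimal sketch for the logistic map at r = 4, a standard chaotic example whose exponent is known analytically to be ln 2 (about 0.693); the orbit length and seed are illustrative choices.

```python
import math

def lyapunov_logistic(r=4.0, x0=0.2, n=100_000, burn=1_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) along one orbit."""
    x = x0
    for _ in range(burn):                        # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))  # ln|f'(x_n)|
        x = r * x * (1 - x)
    return total / n

lam = lyapunov_logistic()
print(lam)  # positive, indicating exponential chaos
```

A positive estimate signals exponential divergence of nearby orbits; the subexponential and superexponential cases discussed in the abstract are precisely those where this ordinary average fails to capture the instability, motivating the generalized exponent.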

The interactions between Computer Science and the Social Sciences have grown fruitfully over the past 20 years. The mutual benefits of this cross-fertilization are felt at the conceptual, technological, and methodological levels alike. Economics in particular has benefited from innovations in multi-agent systems in Computer Science, leading to agent-based computational economics; in return, multi-agent systems have benefited from economic research on mechanisms of incentives and regulation for designing self-organized systems. Created 10 years ago, in 2005 in Lille (France), by Philippe Matthieu and his team, the Artificial Economics conference series reveals the liveliness of the collaborations and exchanges between computer scientists and economists in particular. The excellent quality of this conference has been recognized since its inception, and its proceedings have been regularly published in Springer's Lecture Notes in Economics and Mathematical Systems series. At about the same time, the European Social Simulation Association was created and decided to support an annual conference dedicated to computational approaches to the social sciences. The two communities have developed alongside each other for the past ten years, with evident overlaps in both their approaches and their membership. This year, the two conferences decided to join forces and hold a common event, the Social Simulation Conference, in Barcelona, Spain, 1-5 September 2014, hosting the 10th edition of the Artificial Economics Conference. In this edition, 32 submissions from 11 countries were received, from which we selected 20 for presentation (a nearly 60% acceptance rate). The papers were then revised and extended, and 19 were selected to form part of this volume.
Via Jorge Louçã, NESS
The question “What is Complexity?” has occupied a great deal of time and paper over the last 20 or so years. There are a myriad of different perspectives and definitions, but still no consensus. In this paper I take a phenomenological approach, identifying several factors that discriminate well between systems that would be consensually agreed to be simple versus others that would be consensually agreed to be complex: biological systems and human languages. I argue that a crucial component is that of structural building block hierarchies that, in the case of complex systems, correspond also to a functional hierarchy. I argue that complexity is an emergent property of this structural/functional hierarchy, induced by a property (fitness in the case of biological systems, meaning in the case of languages) that links the elements of the hierarchy across multiple scales. Additionally, I argue that non-complex systems "are" while complex systems "do", so that the latter, in distinction to physical systems, must be described not only in a space of states but also in a space of update rules (strategies) which we do not know how to specify. Further, the existence of structural/functional building block hierarchies allows for the functional specialisation of structural modules, as amply observed in nature. Finally, I argue that there is at least one measuring apparatus capable of measuring complexity as characterised in the paper: the human brain itself. What Isn't Complexity? Christopher R. Stephens http://arxiv.org/abs/1502.03199
Via Complexity Digest
Zipf's law is just one of many universal laws proposed to describe statistical regularities in language. Here we review and critically discuss how these laws can be statistically interpreted, fitted, and tested (falsified). The modern availability of large databases of written text allows for tests with an unprecedented statistical accuracy and also a characterization of the fluctuations around the typical behavior. We find that fluctuations are usually much larger than expected based on simplifying statistical assumptions (e.g., independence and lack of correlations between observations). These simplifications appear also in the usual statistical tests, so that the large fluctuations can be erroneously interpreted as a falsification of the law. Instead, here we argue that linguistic laws are only meaningful (falsifiable) if accompanied by a model for which the fluctuations can be computed (e.g., a generative model of the text). The large fluctuations we report show that the constraints imposed by linguistic laws on the creativity process of text generation are not as tight as one could expect. Statistical laws in linguistics Eduardo G. Altmann, Martin Gerlach http://arxiv.org/abs/1502.03296
Via Complexity Digest
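The role of fluctuations is easy to see in a simulation (our illustration, not the authors' procedure). The sketch below draws two independent token samples of equal size from the same zeta (Zipf-like) distribution; even though the law holds exactly by construction, the rank-frequency counts differ visibly between the two samples, which is why a fluctuation model is needed before a deviation can count as a falsification.

```python
import numpy as np

rng = np.random.default_rng(0)

def rank_freq(n_tokens, a=2.0):
    """Sample n_tokens word 'types' from a zeta distribution and return
    their frequencies sorted by rank (most frequent first)."""
    tokens = rng.zipf(a, size=n_tokens)
    _, counts = np.unique(tokens, return_counts=True)
    return np.sort(counts)[::-1]

f1 = rank_freq(50_000)
f2 = rank_freq(50_000)
print(f1[:5])
print(f2[:5])  # same law, same sample size, noticeably different counts
```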
To maintain stability yet retain the flexibility to adapt to changing circumstances, social systems must strike a balance between the maintenance of a shared reality and the survival of minority opinion. A computational model is presented that investigates the interplay of two basic, oppositional social processes—conformity and anticonformity—in promoting the emergence of this balance. Computer simulations employing a cellular automata platform tested hypotheses concerning the survival of minority opinion and the maintenance of system stability for different proportions of anticonformity. Results revealed that a relatively small proportion of anticonformists facilitated the survival of a minority opinion held by a larger number of conformists who would otherwise succumb to pressures for social consensus. Beyond a critical threshold, however, increased proportions of anticonformists undermined social stability. Understanding the adaptive benefits of balanced oppositional forces has implications for optimal functioning in psychological and social processes in general. The Critical Few: Anticonformists at the Crossroads of Minority Opinion Survival and Collapse by Matthew Jarman, Andrzej Nowak, Wojciech Borkowski, David Serfass, Alexander Wong and Robin Vallacher http://jasss.soc.surrey.ac.uk/18/1/6.html
Via Complexity Digest
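A toy version of such a model is easy to write. The following is a hedged sketch, not the authors' cellular automaton: binary opinions on a ring, where conformist cells adopt their local three-cell majority and a small fraction of anticonformist cells adopt the local minority instead. The population size, minority share, and anticonformist fraction are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200
opinion = rng.choice([-1, 1], size=N, p=[0.3, 0.7])  # minority opinion at 30%
anti = rng.random(N) < 0.1                           # 10% anticonformists

def update(op):
    """Synchronous update: conformists copy the local three-cell majority;
    anticonformists copy the local minority."""
    left, right = np.roll(op, 1), np.roll(op, -1)
    majority = np.sign(left + op + right)  # sum of three +-1 values is never 0
    return np.where(anti, -majority, majority)

for _ in range(100):
    opinion = update(opinion)
print(np.mean(opinion == 1))  # fraction holding the initially-majority opinion
```

In this crude sketch the anticonformists act as fixed dissent sources; the paper's simulations vary their proportion systematically to locate the threshold beyond which they undermine stability.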
During the 1960s, but mainly in the 1970s, large mathematical dynamic global models were implemented in computers to simulate the entire world, or large portions of it. Several different but interrelated subjects were considered simultaneously, and their variables evolved over time in an attempt to forecast the future, with time horizons of decades. Global models continued to be developed while evidencing an increasing bias towards environmental aspects, or at least the public impact of models with such a focus became prevalent. In this paper we analyze the early evolution of computer-based global modeling and provide insights on less known pioneering works by South American modelers in the 1960s (Varsavsky and collaborators). We revisit relevant methodological aspects and discuss how they influenced different modeling endeavors. Finally, we overview how distinctive systemic approaches in global modeling evolved into the currently well-established discipline of complex systems.
Via Bernard Ryefield
(2003). Organizations as self-organizing and sustaining systems: a complex and autopoietic systems perspective. International Journal of General Systems, Vol. 32, No. 5, pp. 459-474. doi: 10.1080/0308107031000135017
Aha..... That is Interesting!: John Holland, 85 Years Young (Exploring Complexity), by Jan W Vasbinder. John Holland is one of the few scientists who, all by themselves and by their pursuits, helped change the course of science and the wealth of human knowledge. There is hardly a field of science or problems
All these examples tell the same story: that the world contains an unimaginably vast amount of digital information which is getting ever vaster ever more rapidly. This makes it possible to do many things that previously could not be done: spot business trends, prevent diseases, combat crime and so on. Managed well, the data can be used to unlock new sources of economic value, provide fresh insights into science and hold governments to account.
Via Complexity Institute
We introduce a new conception of community structure, which we refer to as hidden community structure. Hidden community structure refers to a specific type of overlapping community structure in which the detection of weak, but meaningful, communities is hindered by the presence of stronger communities. We present Hidden Community Detection (HICODE), an algorithm template that identifies both the strong, dominant community structure and the weaker, hidden community structure in networks. HICODE begins by applying an existing community detection algorithm to a network and then removing the structure of the detected communities from the network. In this way, the structure of the weaker communities becomes visible. Through application of HICODE, we demonstrate that a wide variety of real networks from different domains contain many communities that, though meaningful, are not detected by any of the popular community detection algorithms that we consider. Additionally, on both real and synthetic networks containing a hidden ground-truth community structure, HICODE uncovers this structure better than any of the baseline algorithms we compared against. For example, on a real network of undergraduate students that can be partitioned either by 'Dorm' (residence hall) or 'Year', we see that HICODE uncovers the weaker 'Year' communities with a JCRecall score (a recall-based metric that we define in the text) of over 0.7, while the baseline algorithms achieve scores below 0.2.
Via Ashish Umre
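The detect-then-reveal loop of the template can be sketched with networkx. The function name and the crude reduction step (simply deleting intra-community edges) are our own simplifications; HICODE's actual reduction schemes are more refined, and any partition-based detection algorithm could stand in for the one used here.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def detect_then_reveal(G):
    """One round of the detect / reduce / re-detect template."""
    strong = list(greedy_modularity_communities(G))  # dominant communities
    H = G.copy()
    for com in strong:
        # Remove the internal structure of each detected community.
        H.remove_edges_from(
            (u, v) for u, v in G.edges(com) if u in com and v in com
        )
    # Re-run detection on the residual graph to expose weaker structure.
    hidden = list(greedy_modularity_communities(H)) if H.number_of_edges() else []
    return strong, hidden

strong, hidden = detect_then_reveal(nx.karate_club_graph())
print(len(strong), len(hidden))
```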
We give exact statistical distributions for the dynamic response of influence networks subjected to external perturbations. We consider networks whose nodes have two internal states labeled 0 and 1. We let N0 nodes be frozen in state 0, N1 in state 1, and the remaining nodes change by adopting the state of a connected node with a fixed probability per time step. The frozen nodes can be interpreted as external perturbations to the subnetwork of free nodes. Analytically extending N0 and N1 to be smaller than 1 enables modeling the case of weak coupling. We solve the dynamical equations exactly for fully connected networks, obtaining the equilibrium distribution, transition probabilities between any two states, and the characteristic time to equilibration. Our exact results are excellent approximations for other topologies, including random, regular lattice, scale-free and small-world networks, when the numbers of fixed nodes are adjusted to take account of the effect of topology on coupling to the environment. This model can describe a variety of complex systems, from magnetic spins to social networks to population genetics, and was recently applied as a framework for early warning signals for real-world self-organized economic market crises.
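On a fully connected network the model is straightforward to simulate (the parameters below are illustrative, not taken from the paper). Each free node repeatedly copies the state of a uniformly chosen node, free or frozen; with equal numbers of frozen-0 and frozen-1 nodes the long-run average fraction of 1s settles near N1/(N0 + N1) = 0.5, and the frozen counts control how tightly the free nodes are pinned to the environment.

```python
import random

random.seed(42)
n_free, N0, N1 = 50, 5, 5                 # free nodes, frozen 0s, frozen 1s
state = [random.randint(0, 1) for _ in range(n_free)]

def sweep(state):
    """Each free node copies the state of one random node (free or frozen)."""
    for i in range(len(state)):
        j = random.randrange(len(state) + N0 + N1)
        if j < len(state):
            state[i] = state[j]           # copy another free node
        elif j < len(state) + N0:
            state[i] = 0                  # copy a frozen-0 node
        else:
            state[i] = 1                  # copy a frozen-1 node

counts = []
for t in range(2000):
    sweep(state)
    counts.append(sum(state))

avg = sum(counts[500:]) / len(counts[500:]) / n_free  # discard the transient
print(avg)  # hovers near 0.5 for N0 == N1
```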
The goal of this thematic series is to provide a discussion venue about recent advances in the study of networks and their applications to the study of collective behavior in sociotechnical systems. The series includes contributions exploring the intersection between data-driven studies of complex networks and agent-based models of collective social behavior. Particular attention is devoted to topics aimed at understanding social behavior through the lens of data about technology-mediated communication. These include: modeling social dynamics of attention and collaboration, characterizing online group formation and evolution, and studying the emergence of roles and interaction patterns in social media environments. Collective behaviors and networks Giovanni Luca Ciampaglia, Emilio Ferrara and Alessandro Flammini EPJ Data Science 2014, 3:37 http://dx.doi.org/10.1140/epjds/s1368801400376
Via Complexity Digest
We consider biological individuality in terms of information-theoretic and graphical principles. Our purpose is to extract, through an algorithmic decomposition, system-environment boundaries supporting individuality. We infer or detect evolved individuals rather than assume that they exist. Given a set of consistent measurements over time, we discover a coarse-grained or quantized description of a system, inducing partitions (which can be nested). Legitimate individual partitions will propagate information from the past into the future, whereas spurious aggregations will not. Individuals are therefore defined in terms of ongoing, bounded information-processing units rather than lists of static features or conventional replication-based definitions, which tend to fail in the case of cultural change. One virtue of this approach is that it could expand the scope of what we consider adaptive or biological phenomena, particularly in the microscopic and macroscopic regimes of molecular and social phenomena. The Information Theory of Individuality David Krakauer, Nils Bertschinger, Eckehard Olbrich, Nihat Ay, Jessica C. Flack http://arxiv.org/abs/1412.2447
Via Complexity Digest, Bernard Ryefield
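The criterion that a legitimate individual partition should propagate information from the past into the future can be illustrated with a toy mutual-information estimate (our example, not the authors' decomposition). A persistent two-state chain carries predictive information about its own future state, while an i.i.d. sequence carries essentially none.

```python
import math
import random
from collections import Counter

random.seed(0)

def series(p_stay, n=200_000):
    """A two-state chain that keeps its current state with probability p_stay."""
    x, out = 0, []
    for _ in range(n):
        if random.random() > p_stay:
            x = 1 - x
        out.append(x)
    return out

def mi(xs):
    """Plug-in estimate of the mutual information I(x_t; x_{t+1}) in bits."""
    n = len(xs) - 1
    pairs = Counter(zip(xs, xs[1:]))
    px, py = Counter(xs[:-1]), Counter(xs[1:])
    return sum(
        (c / n) * math.log2(c * n / (px[a] * py[b]))
        for (a, b), c in pairs.items()
    )

print(mi(series(0.9)))  # persistent chain: substantial predictive information
print(mi(series(0.5)))  # i.i.d. sequence: close to zero
```

Under the paper's criterion, a partition behaving like the first series would qualify as an ongoing information-processing unit, while one behaving like the second would be a spurious aggregation.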
