The distribution of firms' growth and firms' sizes is a topic under intense scrutiny. In this paper we show that a thermodynamic model based on the Maximum Entropy Principle, with dynamical prior information, can be constructed that adequately describes the dynamics and distribution of firms' growth. Our theoretical framework is tested against a comprehensive database of Spanish firms, which covers Spain's economic activity to a very large extent, with a total of 1,155,142 firms evolving over a full decade. We show that the empirical exponent of Pareto's law, a rule often observed in the rank distribution of large-size firms, is explained by the capacity of the economic system for creating/destroying firms, and can be used to measure the health of a capitalist-based economy. Indeed, our model predicts that when the exponent is larger than 1, creation of firms is favored; when it is smaller than 1, destruction of firms is favored instead; and when it equals 1 (matching Zipf's law), the system is in full macroeconomic equilibrium, entailing "free" creation and/or destruction of firms. For medium and smaller firm sizes, the dynamical regime changes; the whole distribution can no longer be fitted to a single simple analytic form, and numerical prediction is required. Our model constitutes the basis of a full predictive framework for the economic evolution of an ensemble of firms that can potentially be used to develop simulations and test hypothetical scenarios, such as economic crises or the response to specific policy measures.
Thermodynamics of firms' growth Eduardo Zambrano, Alberto Hernando, Aurelio Fernandez-Bariviera, Ricardo Hernando, Angelo Plastino
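As an illustrative aside (not the authors' model), the rank statistics invoked above can be sketched with synthetic data: firm sizes drawn from a Pareto law with tail exponent 1 (the Zipf/equilibrium case), with the exponent recovered by a Hill estimator. The sample size, seed, and choice of estimator are our own assumptions.

```python
import math
import random

# Synthetic sketch: firm sizes from a Pareto law with tail exponent 1
# (Zipf's law, the paper's macroeconomic-equilibrium case). All parameter
# values here are illustrative assumptions, not the authors' calibration.
random.seed(42)
alpha_true = 1.0
sizes = sorted((random.paretovariate(alpha_true) for _ in range(20000)),
               reverse=True)

def hill_exponent(sorted_sizes, k):
    """Maximum-likelihood (Hill) estimate of the Pareto tail exponent
    from the k largest observations."""
    threshold = sorted_sizes[k]          # (k+1)-th largest size
    return k / sum(math.log(s / threshold) for s in sorted_sizes[:k])

alpha_hat = hill_exponent(sizes, 1000)
```

With 20,000 firms and the 1,000 largest used for the fit, the recovered exponent should lie close to 1, the regime the paper identifies with "free" creation/destruction of firms.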
Detailed knowledge of the energy needs at relatively high spatial and temporal resolution is crucial for the electricity infrastructure planning of a region. However, such information is typically limited by the scarcity of data on human activities, in particular in developing countries where electrification of rural areas is sought. The analysis of society-wide mobile phone records has recently proven to offer unprecedented insights into the spatio-temporal distribution of people, but this information has never been used to support electrification planning strategies, in particular for rural areas in developing countries. The aim of this project is to assess the contribution of mobile phone data to the development of bottom-up energy demand models, in order to enhance energy planning studies and existing electrification practices. More specifically, this work introduces a framework that combines mobile phone data analysis, socioeconomic and geo-referenced data analysis, and state-of-the-art energy infrastructure engineering techniques to assess the techno-economic feasibility of different centralized and decentralized electrification options for rural areas in a developing country. Specific electrification options considered include extensions of the existing medium voltage (MV) grid, diesel engine-based community-level microgrids, and individual household-level solar photovoltaic (PV) systems. The framework and relevant methodology are demonstrated throughout the paper using the case of Senegal and the mobile phone data made available for the 'D4D-Senegal' innovation challenge. The results are extremely encouraging and highlight the potential of mobile phone data to support more efficient and economically attractive electrification plans.
Using Mobile Phone Data for Electricity Infrastructure Planning Eduardo Alejandro Martinez-Cesena, Pierluigi Mancarella, Mamadou Ndiaye, Markus Schläpfer
Complexity and the Arrow of Time. Charles H. Lineweaver, Paul C. W. Davies, Michael Ruse. There is a widespread assumption that the universe in general, and life in particular, is 'getting more complex with time'. This book brings together a wide range of experts in science.
In the seminal work 'An Evolutionary Approach to Norms', Axelrod identified internalization as one of the key mechanisms that supports the spreading and stabilization of norms. But how does this process work? This paper advocates a rich cognitive model of different types, degrees and factors of norm internalization. Rather than an all-or-nothing phenomenon, we claim that norm internalization is a dynamic process, whose deepest step occurs when norms are complied with thoughtlessly. In order to implement a theoretical model of internalization and check its effectiveness in sustaining social norms and promoting cooperation, a simulated web-service distributed market has been designed, where both services and agents' tasks are dynamically assigned. Internalizers are compared with agents whose behaviour is driven only by self-interested motivations. Simulation findings show that in dynamic unpredictable scenarios, internalizers prove more adaptive and achieve higher levels of cooperation than agents whose decision-making is based only on utility calculation.
Self-Policing Through Norm Internalization: A Cognitive Solution to the Tragedy of the Digital Commons in Social Networks by Daniel Villatoro, Giulia Andrighetto, Rosaria Conte and Jordi Sabater-Mir http://jasss.soc.surrey.ac.uk/18/2/2.html
The rapid changes occurring in the higher education domain are placing increasing pressure on the actors in this space to focus efforts on identifying and adopting strategies for success. One particular group of interest is academics or scientists, and the ways that these individuals, or collectives as institutional or discipline-based science systems, make decisions about how best to achieve success in their chosen field. The agent-based model and simulation that we present draws on the hypothetical "strategic publication model" proposed by Mölders, Fink and Weyer (2011), and extends this work by defining experimental settings to implement a prototype ABMS in NetLogo. While considerable work remains to fully resolve theoretical issues relating to the scope, calibration and validation of the model, this work goes some way toward resolving the details associated with defining appropriate experimental settings. Also presented are the results of four experiments that explore the emergent effects that result from varying the strategic mix of actors in the system.
In the last few years, electronic media brought a revolution in the traceability of social phenomena. As particles in a bubble chamber, social trajectories leave digital trails that can be analyzed to gain a deeper understanding of collective life. To make sense of these traces a renewed collaboration between social and natural scientists is needed. In this paper, we claim that current research strategies based on micro-macro models are unfit to unfold the complexity of collective existence and that the priority should instead be the development of new formal tools to exploit the richness of digital data.
Cooperative game theory deals with situations where the objectives of participants of the game are partially cooperative and partially conflicting. It is in the interest of participants to cooperate, in the sense of making binding agreements, to achieve the maximum possible benefit. When it comes to the distribution of benefits/payoffs, participants have conflicting interests. Such situations are usually modelled as cooperative games. While the book mainly discusses transferable utility games, there is also a brief analysis of non-transferable utility games. Alternative solution concepts to cooperative game theoretic problems are presented in chapters 1-9, and the next four chapters present issues related to the computation of the solutions discussed in the earlier chapters. The proofs of all results presented in the book are quite explicit. Additionally, the mathematical techniques employed in demonstrating the results will be helpful to those who wish to learn the application of mathematics to solving problems in game theory.
How do shared conventions emerge in complex decentralized social systems? This question engages fields as diverse as linguistics, sociology, and cognitive science. Previous empirical attempts to solve this puzzle all presuppose that formal or informal institutions, such as incentives for global agreement, coordinated leadership, or aggregated information about the population, are needed to facilitate a solution. Evolutionary theories of social conventions, by contrast, hypothesize that such institutions are not necessary in order for social conventions to form. However, empirical tests of this hypothesis have been hindered by the difficulties of evaluating the real-time creation of new collective behaviors in large decentralized populations. Here, we present experimental results—replicated at several scales—that demonstrate the spontaneous creation of universally adopted social conventions and show how simple changes in a population’s network structure can direct the dynamics of norm formation, driving human populations with no ambition for large scale coordination to rapidly evolve shared social conventions.
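The spontaneous convention formation described above is often studied with a minimal naming game; the sketch below is our own illustrative choice, not the authors' experimental protocol, and shows a fully connected population converging on a single shared name without any central coordination.

```python
import random

# Minimal naming game on a fully connected population (illustrative sketch;
# agent count and interaction budget are arbitrary assumptions).
random.seed(7)
n_agents = 20
vocab = [set() for _ in range(n_agents)]   # each agent's known names
next_name = 0                              # counter for invented names

converged = False
for _ in range(100000):
    speaker, hearer = random.sample(range(n_agents), 2)
    if not vocab[speaker]:                 # invent a name if none is known
        vocab[speaker].add(next_name)
        next_name += 1
    name = random.choice(sorted(vocab[speaker]))
    if name in vocab[hearer]:              # success: both collapse to it
        vocab[speaker] = {name}
        vocab[hearer] = {name}
    else:                                  # failure: hearer learns the name
        vocab[hearer].add(name)
    if all(v == vocab[0] and len(v) == 1 for v in vocab):
        converged = True
        break
```

Even with no incentive for global agreement, local pairwise interactions drive the population to a universal convention, the qualitative phenomenon the experiments above demonstrate with human subjects.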
Contemporary complexity theory has been instrumental in providing novel rigorous definitions for some classic philosophical concepts, including emergence. In an attempt to provide an account of emergence that is consistent with complexity and dynamical systems theory, several authors have turned to the notion of constraints on state transitions. Drawing on complexity theory directly, this paper builds on those accounts, further developing the constraint-based interpretation of emergence and arguing that such accounts recover many of the features of more traditional accounts. We show that the constraint-based account of emergence also leads naturally into a meaningful definition of self-organization, another concept that has received increasing attention recently. Along the way, we distinguish between order and organization, two concepts which are frequently conflated. Finally, we consider possibilities for future research in the philosophy of complex systems, as well as applications of the distinctions made in this paper.
Self-Organization, Emergence, and Constraint in Complex Natural Systems Jonathan Lawhead
We analyze the replicator-mutator equations for the Rock-Paper-Scissors game. Various graph-theoretic patterns of mutation are considered, ranging from a single unidirectional mutation pathway between two of the species, to global bidirectional mutation among all the species. Our main result is that the coexistence state, in which all three species exist in equilibrium, can be destabilized by arbitrarily small mutation rates. After it loses stability, the coexistence state gives birth to a stable limit cycle solution created in a supercritical Hopf bifurcation. This attracting periodic solution exists for all the mutation patterns considered, and persists arbitrarily close to the limit of zero mutation rate and a zero-sum game.
Nonlinear Dynamics of the Rock-Paper-Scissors Game with Mutations Danielle F. P. Toupo, Steven H. Strogatz
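A minimal sketch of the dynamics discussed above, assuming uniform bidirectional mutation and a forward-Euler integrator; the step size, mutation rate, and the background fitness added to keep fitnesses nonnegative are our own choices, not the paper's exact setup.

```python
# Replicator-mutator equations for zero-sum Rock-Paper-Scissors with
# uniform bidirectional mutation at rate mu (illustrative sketch only).
A = [[0, -1, 1], [1, 0, -1], [-1, 1, 0]]    # R beats S, P beats R, S beats P
mu = 0.01                                    # small uniform mutation rate
dt, steps = 0.001, 50000

# q[j][i]: probability that an offspring of strategy j plays strategy i
q = [[1 - 2 * mu if i == j else mu for i in range(3)] for j in range(3)]

x = [0.5, 0.3, 0.2]                          # initial population shares
for _ in range(steps):
    # fitness with a background term of 1 so all fitnesses stay nonnegative
    f = [1 + sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
    phi = sum(x[i] * f[i] for i in range(3))  # mean fitness
    x = [x[i] + dt * (sum(x[j] * f[j] * q[j][i] for j in range(3))
                      - phi * x[i]) for i in range(3)]
```

The dynamics conserve the simplex (shares stay positive and sum to one); for small positive mu the trajectory settles onto the oscillatory behavior associated with the limit cycle described in the abstract.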
All systems in nature have one thing in common: they process information. Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred and modified. Indeed, bits of information about the state of one element travel, imperfectly, to the state of the other element, forming its new state. This storage, transfer, and modification of information, possibly between levels of a multilevel system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions. Mapping out exactly how these bits of information percolate through the system could reveal fundamental new insights into how the parts orchestrate to produce the properties of the system. A theory of information processing would be capable of defining a set of universal properties of dynamical multilevel complex systems, which describe and compare the dynamics of diverse complex systems ranging from social interaction to brain networks, from financial markets to biomedicine. Each possible combination of rules of dynamics and topology of interactions, with disparate semantics, would reduce to a single language of information processing.
Guest Editor: Dr. Rick Quax
Deadline for manuscript submissions: 28 February 2015
Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.
Introduced by the late Per Bak and his colleagues, self-organized criticality (SOC) has been one of the most stimulating concepts to come out of statistical mechanics and condensed matter theory in the last few decades, and has played a significant role in the development of complexity science. SOC, and more generally fractals and power laws, have attracted much comment, ranging from the very positive to the polemical. The other papers in this special issue (Aschwanden et al., 2014; McAteer et al., 2014; Sharma et al., 2015) showcase the considerable body of observations in solar, magnetospheric and fusion plasma inspired by the SOC idea, and expose the fertile role the new paradigm has played in approaches to modeling and understanding multiscale plasma instabilities. This very broad impact, and the necessary process of adapting a scientific hypothesis to the conditions of a given physical system, has meant that SOC as studied in these fields has sometimes differed significantly from the definition originally given by its creators. In Bak's own field of theoretical physics there are significant observational and theoretical open questions, even 25 years on (Pruessner, 2012). One aim of the present review is to address the dichotomy between the great reception SOC has received in some areas, and its shortcomings, as they became manifest in the controversies it triggered. Our article tries to clear up what we think are misunderstandings of SOC in fields more remote from its origins in statistical mechanics, condensed matter and dynamical systems by revisiting Bak, Tang and Wiesenfeld's original papers.
25 Years of Self-Organized Criticality: Concepts and Controversies Nicholas Watkins, Gunnar Pruessner, Sandra Chapman, Norma Bock Crosby, Henrik Jensen
Online social networks represent a popular and highly diverse class of social media systems. Despite this variety, each of these systems undergoes a general process of online social network assembly, which represents the complicated and heterogeneous changes that transform newly born systems into mature platforms. However, little is known about this process. For example, how much of a network's assembly is driven by simple growth? How does a network's structure change as it matures? How does network structure vary with adoption rates and user heterogeneity, and do these properties play different roles at different points in the assembly? We investigate these and other questions using a unique dataset of online connections among the roughly one million users at the first 100 colleges admitted to Facebook, captured just 20 months after its launch. We first show that different vintages and adoption rates across this population of networks reveal temporal dynamics of the assembly process, and that assembly is only loosely related to network growth. We then exploit natural experiments embedded in this dataset and complementary data obtained via Internet archaeology to show that different subnetworks, e.g., among students and among alumni, matured at different rates toward similar end states. These results shed new light on the processes and patterns of online social network assembly, and may facilitate more effective design for online social systems.
Assembling thefacebook: Using heterogeneity to understand online social network assembly Abigail Z. Jacobs, Samuel F. Way, Johan Ugander, Aaron Clauset
Cooperation lies at the foundations of human societies, yet why people cooperate remains a conundrum. The issue of whether population structure can foster cooperative behavior in social dilemmas, known as network reciprocity, has been addressed by many, but theoretical studies have yielded contradictory results so far, as the problem is very sensitive to how players adapt their strategy. However, recent experiments with the prisoner's dilemma game played on different networks and in a specific range of payoffs suggest that humans, at least for those experimental setups, do not consider neighbors' payoffs when making their decisions, and that the network structure does not influence the final outcome. In this work we carry out an extensive analysis of different evolutionary dynamics, taking into account most of the alternatives that have been proposed so far to implement players' strategy updating process. In this manner we show that the absence of network reciprocity is a general feature of the dynamics (among those we consider) that do not take neighbors' payoffs into account. Our results, together with experimental evidence, hint at how to properly model real people's behavior.
Remember domino theory? One country going Communist was supposed to topple the next, and then the next, and the next. The metaphor drove much of United States foreign policy in the middle of the 20th century. But it had the wrong name. From a physical point of view, it should have been called the “sandpile theory.” Real-world political phase transitions tend to happen not in neat sequences, but in sudden coordinated fits, like the Arab Spring, or the collapse of the Eastern Bloc. These reflect quiet periods punctuated by crises—like a sandpile. You can add grains of sand to the top of a sandpile for a while, to no apparent effect. Then, all at once, an avalanche sweeps sand down from the top in an irregular pattern, possibly setting off little sub-avalanches as it goes.
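The avalanche mechanism described above is captured by the classic Bak-Tang-Wiesenfeld sandpile model; a toy version (grid size, grain count, and seed are arbitrary illustrative choices) looks like this:

```python
import random

# Minimal Bak-Tang-Wiesenfeld sandpile: drop grains at random sites,
# topple any site holding 4 or more grains, and record avalanche sizes.
L = 12
grid = [[0] * L for _ in range(L)]

def relax(grid):
    """Topple unstable sites until the grid is stable; return the
    avalanche size (total number of topplings)."""
    topplings = 0
    unstable = True
    while unstable:
        unstable = False
        for i in range(L):
            for j in range(L):
                if grid[i][j] >= 4:
                    unstable = True
                    topplings += 1
                    grid[i][j] -= 4
                    # grains falling off the edge leave the system
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < L and 0 <= nj < L:
                            grid[ni][nj] += 1
    return topplings

random.seed(0)
avalanches = []
for _ in range(3000):
    grid[random.randrange(L)][random.randrange(L)] += 1
    avalanches.append(relax(grid))
```

Most grains cause nothing; occasionally one triggers a cascade of topplings, which is exactly the "quiet periods punctuated by crises" picture invoked in the passage above.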
The world is changing at an ever-increasing pace. And it has changed in a much more fundamental way than one would think, primarily because it has become more connected and interdependent than in our entire history. Every new product, every new invention can be combined with those that existed before, thereby creating an explosion of complexity: structural complexity, dynamic complexity, functional complexity, and algorithmic complexity. How to respond to this challenge?
Responding to Complexity in Socio-Economic Systems: How to Build a Smart and Resilient Society?
The interactions between Computer Science and the Social Sciences have grown fruitfully over the past 20 years. The mutual benefits of this cross-fertilization appear at the conceptual, technological and methodological levels alike. Economics in particular benefited from innovations in multi-agent systems in Computer Science, leading to agent-based computational economics; in return, multi-agent systems benefited, for instance, from economic research on incentive and regulation mechanisms for designing self-organized systems. Created 10 years ago, in 2005 in Lille (France), by Philippe Matthieu and his team, the Artificial Economics conference series reveals the liveliness of the collaborations and exchanges between computer scientists and economists in particular. The excellent quality of this conference has been recognized since its inception, and its proceedings have been regularly published in Springer's Lecture Notes in Economics and Mathematical Systems series. At about the same period, the European Social Simulation Association was created and decided to support an annual conference dedicated to computational approaches to the social sciences. Both communities have run alongside each other for the past ten years, with evident overlaps in both their approaches and their members. This year, both conferences decided to join their efforts and hold a common conference, the Social Simulation Conference, in Barcelona, Spain, 1st to 5th September 2014, which will host the 10th edition of the Artificial Economics Conference. In this edition, 32 submissions from 11 countries were received, from which we selected 20 for presentation (nearly 60% acceptance). The papers were then revised and extended, and 19 were selected to form part of this volume.
The question What is Complexity? has occupied a great deal of time and paper over the last 20 or so years. There are myriad different perspectives and definitions but still no consensus. In this paper I take a phenomenological approach, identifying several factors that discriminate well between systems that would be consensually agreed to be simple versus others that would be consensually agreed to be complex - biological systems and human languages. I argue that a crucial component is that of structural building block hierarchies that, in the case of complex systems, correspond also to a functional hierarchy. I argue that complexity is an emergent property of this structural/functional hierarchy, induced by a property - fitness in the case of biological systems and meaning in the case of languages - that links the elements of this hierarchy across multiple scales. Additionally, I argue that non-complex systems "are" while complex systems "do", so that the latter, in distinction to physical systems, must be described not only in a space of states but also in a space of update rules (strategies) which we do not know how to specify. Further, the existence of structural/functional building block hierarchies allows for the functional specialisation of structural modules, as amply observed in nature. Finally, I argue that there is at least one measuring apparatus capable of measuring complexity as characterised in the paper - the human brain itself.
Zipf's law is just one out of many universal laws proposed to describe statistical regularities in language. Here we review and critically discuss how these laws can be statistically interpreted, fitted, and tested (falsified). The modern availability of large databases of written text allows for tests with an unprecedented statistical accuracy and also a characterization of the fluctuations around the typical behavior. We find that fluctuations are usually much larger than expected based on simplifying statistical assumptions (e.g., independence and lack of correlations between observations). These simplifications appear also in usual statistical tests, so that the large fluctuations can be erroneously interpreted as a falsification of the law. Instead, here we argue that linguistic laws are only meaningful (falsifiable) if accompanied by a model for which the fluctuations can be computed (e.g., a generative model of the text). The large fluctuations we report show that the constraints imposed by linguistic laws on the creative process of text generation are not as tight as one could expect.
Statistical laws in linguistics Eduardo G. Altmann, Martin Gerlach
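As a hedged illustration of the fitting problem the authors discuss, the sketch below generates a synthetic Zipfian corpus and recovers the exponent by grid-search maximum likelihood; the vocabulary size, sample size, and exponent grid are our own assumptions, not the paper's methodology.

```python
import math
import random

# Synthetic Zipfian corpus: token ranks drawn with probability ~ r^(-s).
random.seed(1)
V, N, s_true = 500, 50000, 1.1
ranks = list(range(1, V + 1))
weights = [r ** -s_true for r in ranks]
sample = random.choices(ranks, weights=weights, k=N)

counts = {}
for r in sample:
    counts[r] = counts.get(r, 0) + 1

def log_likelihood(s):
    """Log-likelihood of a truncated Zipf law with exponent s."""
    norm = sum(r ** -s for r in ranks)
    return (-s * sum(c * math.log(r) for r, c in counts.items())
            - N * math.log(norm))

# Grid-search MLE over s in [0.5, 1.5].
grid = [0.5 + 0.01 * i for i in range(101)]
s_hat = max(grid, key=log_likelihood)
```

Note that even here the fit only recovers the generating exponent because the fluctuation model (independent multinomial sampling) is known, which is precisely the point the abstract makes: a law is only testable together with a model of its fluctuations.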
To maintain stability yet retain the flexibility to adapt to changing circumstances, social systems must strike a balance between the maintenance of a shared reality and the survival of minority opinion. A computational model is presented that investigates the interplay of two basic, oppositional social processes—conformity and anticonformity—in promoting the emergence of this balance. Computer simulations employing a cellular automata platform tested hypotheses concerning the survival of minority opinion and the maintenance of system stability for different proportions of anticonformity. Results revealed that a relatively small proportion of anticonformists facilitated the survival of a minority opinion held by a larger number of conformists who would otherwise succumb to pressures for social consensus. Beyond a critical threshold, however, increased proportions of anticonformists undermined social stability. Understanding the adaptive benefits of balanced oppositional forces has implications for optimal functioning in psychological and social processes in general.
The Critical Few: Anticonformists at the Crossroads of Minority Opinion Survival and Collapse by Matthew Jarman, Andrzej Nowak, Wojciech Borkowski, David Serfass, Alexander Wong and Robin Vallacher http://jasss.soc.surrey.ac.uk/18/1/6.html
During the 1960s but mainly in the 1970s, large mathematical dynamic global models were implemented in computers to simulate the entire world, or large portions of it. Several different but interrelated subjects were considered simultaneously, and their variables evolved over time in an attempt to forecast the future, considering decades as time horizons. Global models continued to be developed while evidencing an increasing bias towards environmental aspects, or at least the public impact of models with such a focus became prevalent. In this paper we analyze the early evolution of computer-based global modeling and provide insights on less known pioneering works by South American modelers in the 1960s (Varsavsky and collaborators). We revisit relevant methodological aspects and discuss how they influenced different modeling endeavors. Finally, we overview how distinctive systemic approaches in global modeling evolved into the currently well-established discipline of complex systems.
(2003). Organizations as self-organizing and sustaining systems: a complex and autopoietic systems perspective. International Journal of General Systems: Vol. 32, No. 5, pp. 459-474. doi: 10.1080/0308107031000135017