
 Suggested by Joseph Lizier onto Papers

# Emergent Criticality through Adaptive Information Processing in Boolean Networks

We study information processing in populations of Boolean networks with evolving connectivity and systematically explore the interplay between learning capability, robustness, network topology, and task complexity. We solve a long-standing open question and find computationally that, for large system sizes N, adaptive information processing drives the networks to a critical connectivity Kc=2. For finite-size networks, the connectivity approaches the critical value as a power law of the system size N. We show that network learning and generalization are optimized near criticality, provided that the task complexity and the amount of information supplied exceed threshold values. Both random and evolved networks exhibit maximal topological diversity near Kc. We hypothesize that this diversity supports efficient exploration and robustness of solutions. This is also reflected in our observation that the variance of fitness values is maximal in critical network populations. Finally, we discuss the implications of our results for determining the optimal topology of adaptive dynamical networks that solve computational tasks.
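
The critical value Kc=2 can be made concrete with the classic annealed (Derrida) argument: for random Boolean functions with bias p, a one-bit perturbation spreads to 2Kp(1-p) nodes per step on average, which equals 1 exactly at K=2 for p=1/2. A minimal damage-spreading sketch (our illustration, not the authors' code; network size and trial count are arbitrary choices):

```python
import random

def rbn_step(state, inputs, tables):
    """One synchronous update of a random Boolean network."""
    new = []
    for inp, table in zip(inputs, tables):
        idx = 0
        for j in inp:
            idx = (idx << 1) | state[j]   # pack input bits into a table index
        new.append(table[idx])
    return new

def sensitivity(K, p=0.5):
    """Annealed-approximation branching factor 2*K*p*(1-p); critical at 1."""
    return 2 * K * p * (1 - p)

def one_step_damage(N=400, K=2, trials=500, seed=1):
    """Average Hamming distance after one step, starting from a one-bit flip."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        inputs = [[rng.randrange(N) for _ in range(K)] for _ in range(N)]
        tables = [[rng.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]
        x = [rng.randint(0, 1) for _ in range(N)]
        y = list(x)
        y[rng.randrange(N)] ^= 1          # perturb a single node
        fx, fy = rbn_step(x, inputs, tables), rbn_step(y, inputs, tables)
        total += sum(a != b for a, b in zip(fx, fy))
    return total / trials
```

At K=2 the measured one-step damage hovers around 1, the hallmark of criticality; below K=2 perturbations die out, above it they proliferate.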


# Papers

Recent publications related to complex systems
 Scooped by Complexity Digest

Self-organization and adaptability are critical properties of complex adaptive systems (CAS), and their analysis provides insight into the design of these systems, consequently leading to real-world advancements. However, these properties are difficult to analyze in real-world scenarios due to performance constraints, metric design, and limitations in existing modeling tools. Several metrics have been proposed for their identification, but metric effectiveness under the same experimental settings has not been studied before. In this paper we present an observation tool, part of a complex adaptive systems modeling framework, that allows for the analysis of these metrics for large-scale complex models. We compare and contrast a wide range of metrics implemented in our observation tool. Our experimental analysis uses the classic model of Game of Life to provide a baseline for analysis, and a more complex Emergency Department model to further explore the suitability of these metrics and the modeling and analysis challenges faced when using them.

Lachlan Birdsey, Claudia Szabo, Katrina Falkner

Published in: 2017 IEEE 11th International Conference on Self-Adaptive and Self-Organizing Systems (SASO)
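
The Game of Life makes a good baseline precisely because its update rule is tiny and fully specified, so any emergence metric can be exercised on well-understood patterns (gliders, oscillators, still lifes). A self-contained sketch of one synchronous step on a toroidal grid (the representation and function name are ours, not the paper's tool):

```python
def life_step(grid):
    """One synchronous Game of Life update on a toroidal grid of 0/1 cells."""
    R, C = len(grid), len(grid[0])
    nxt = [[0] * C for _ in range(R)]
    for r in range(R):
        for c in range(C):
            # count the eight neighbours, wrapping at the edges
            n = sum(grid[(r + dr) % R][(c + dc) % C]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            # birth on exactly 3 neighbours, survival on 2 or 3
            nxt[r][c] = 1 if n == 3 or (grid[r][c] and n == 2) else 0
    return nxt
```

A period-2 oscillator such as the blinker gives an immediate sanity check for any metric implemented on top of this update.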


## Network control principles predict neuron function in the Caenorhabditis elegans connectome

Recent studies on the controllability of complex systems offer a powerful mathematical framework to systematically explore the structure–function relationship in biological, social, and technological networks [1-3]. Despite theoretical advances, we lack direct experimental proof of the validity of these widely used control principles. Here we fill this gap by applying a control framework to the connectome of the nematode Caenorhabditis elegans [4-6], allowing us to predict the involvement of each C. elegans neuron in locomotor behaviours. We predict that control of the muscles or motor neurons requires 12 neuronal classes, which include neuronal groups previously implicated in locomotion by laser ablation [7-13], as well as one previously uncharacterized neuron, PDB. We validate this prediction experimentally, finding that the ablation of PDB leads to a significant loss of dorsoventral polarity in large body bends. Importantly, control principles also allow us to investigate the involvement of individual neurons within each neuronal class. For example, we predict that, within the class of DD motor neurons, only three (DD04, DD05, or DD06) should affect locomotion when ablated individually. This prediction is also confirmed; single-cell ablations of DD04 or DD05 specifically affect posterior body movements, whereas ablations of DD02 or DD03 do not. Our predictions are robust to deletions of weak connections, missing connections, and rewired connections in the current connectome, indicating the potential applicability of this analytical framework to larger and less well-characterized connectomes.

Network control principles predict neuron function in the Caenorhabditis elegans connectome
Gang Yan, Petra E. Vértes, Emma K. Towlson, Yee Lian Chew, Denise S. Walker, William R. Schafer & Albert-László Barabási

Nature (2017) doi:10.1038/nature24056
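
The control framework applied here builds on structural controllability, in which the minimum number of driver nodes equals N minus the size of a maximum matching of the directed network (Liu, Slotine and Barabási). A toy sketch using Kuhn's augmenting-path algorithm; the paper's actual analysis additionally handles muscle targets, edge strengths and robustness checks:

```python
def max_matching(n, edges):
    """Maximum matching of the bipartite graph out-copies -> in-copies
    of a directed network given as (u, v) edges over nodes 0..n-1."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
    match_in = [-1] * n  # match_in[v] = u means edge u -> v is matched

    def augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if match_in[v] == -1 or augment(match_in[v], seen):
                match_in[v] = u
                return True
        return False

    return sum(augment(u, set()) for u in range(n))

def n_drivers(n, edges):
    """Minimum number of driver nodes for structural controllability."""
    return max(1, n - max_matching(n, edges))
```

Unmatched nodes are exactly the nodes that must receive independent control signals.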


## Preliminary Steps Toward a Universal Economic Dynamics for Monetary and Fiscal Policy

We consider the relationship between economic activity and intervention, including monetary and fiscal policy, using a universal monetary and response dynamics framework. Central bank policies are designed for economic growth without excess inflation. However, unemployment, investment, consumption, and inflation are interlinked. Understanding dynamics is crucial to assessing the effects of policy, especially in the aftermath of the recent financial crisis. Here we lay out a program of research into monetary and economic dynamics and preliminary steps toward its execution. We use general principles of response theory to derive specific implications for policy. We find that the current approach, which considers the overall supply of money to the economy, is insufficient to effectively regulate economic growth. While it can achieve some degree of control, optimizing growth also requires a fiscal policy balancing monetary injection between two dominant loop flows, the consumption and wages loop, and the investment and returns loop. The balance arises from a composite of government tax, entitlement, and subsidy policies, corporate policies, as well as monetary policy. We further show that empirical evidence is consistent with a transition in 1980 between two regimes—from an oversupply to the consumption and wages loop, to an oversupply of the investment and returns loop. The imbalance is manifest in savings and borrowing by consumers and investors, and in inflation. The latter followed an increasing trend until 1980, and a decreasing one since then, resulting in a zero interest rate largely unrelated to the financial crisis. Three recessions and the financial crisis are part of this dynamic. Optimizing growth now requires shifting the balance. Our analysis supports advocates of greater income and/or government support for the poor, who use a larger fraction of income for consumption. This promotes investment due to the growth in expenditures. Otherwise, investment has limited opportunities to gain returns above inflation, so capital remains uninvested and does not contribute to the growth of economic activity.

Yaneer Bar-Yam, Jean Langlois-Meurinne, Mari Kawakatsu, Rodolfo Garcia, Preliminary steps toward a universal economic dynamics for monetary and fiscal policy, New England Complex Systems Institute (October 10, 2017).


## Evolution and Devolution of Social Complexity: Why Do We Care?

Over the past 10,000 years human societies evolved from “simple” – small egalitarian groups, integrated by face-to-face interactions – to “complex” – huge anonymous societies of millions, characterized by great differentials in wealth and power, extensive division of labor, elaborate governance structures, and sophisticated information systems. What were the evolutionary processes that brought about such an enormous increase in social scale and complexity?

We also need to understand why the social forces that hold huge human societies together sometimes fail to do so. Complex societies collapsed on numerous occasions in the past, and may be at risk today. There are clear signs that even industrialized, wealthy, and democratic Western societies, which until recently seemed immune to collapse, are becoming less stable. Research on social complexity will bring understanding that is of direct value to our societies and to human well-being.
Arjen ten Have's curator insight,
Interesting read on the application of biological evolutionary insights to the evolution of society.

## Resilience management during large-scale epidemic outbreaks

Assessing and managing the impact of large-scale epidemics considering only the individual risk and severity of the disease is exceedingly difficult and could be extremely expensive. Economic consequences, infrastructure and service disruption, as well as the recovery speed, are just a few of the many dimensions along which to quantify the effect of an epidemic on society's fabric. Here, we extend the concept of resilience to characterize epidemics in structured populations, by defining the system-wide critical functionality that combines an individual's risk of getting the disease (disease attack rate) and the disruption to the system's functionality (human mobility deterioration). By studying both conceptual and data-driven models, we show that the integrated consideration of individual risks and societal disruptions under a resilience assessment framework provides an insightful picture of how an epidemic might impact society. In particular, containment interventions intended for a straightforward reduction of the risk may have a net negative impact on the system by slowing down the recovery of basic societal functions. The presented study operationalizes the resilience framework, providing a more nuanced and comprehensive approach for optimizing containment schemes and mitigation policies in the case of epidemic outbreaks.

Resilience management during large-scale epidemic outbreaks
Emanuele Massaro, Alexander Ganin, Nicola Perra, Igor Linkov, Alessandro Vespignani
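
One simple way to operationalize the system-wide critical functionality described above is to combine the two dimensions multiplicatively and score resilience as the normalized area under the resulting curve. This functional form is our assumption for illustration; the paper's exact definition may differ:

```python
def critical_functionality(attack_rate, mobility):
    """CF(t) combining infection risk (attack rate in [0,1]) with
    mobility deterioration (functioning fraction in [0,1])."""
    return [(1 - a) * m for a, m in zip(attack_rate, mobility)]

def resilience(attack_rate, mobility):
    """Normalized area under CF(t) over the observation window."""
    cf = critical_functionality(attack_rate, mobility)
    return sum(cf) / len(cf)
```

A containment policy that lowers the attack rate but suppresses mobility for longer can then score worse overall, which is precisely the trade-off the abstract highlights.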


## Effects of motion in structured populations

In evolutionary processes, population structure has a substantial effect on natural selection. Here, we analyse how motion of individuals affects constant selection in structured populations. Motion is relevant because it leads to changes in the distribution of types as mutations march towards fixation or extinction. We describe motion as the swapping of individuals on graphs, and more generally as the shuffling of individuals between reproductive updates. Beginning with a one-dimensional graph, the cycle, we prove that motion suppresses natural selection for death–birth (DB) updating or for any process that combines birth–death (BD) and DB updating. If the rule is purely BD updating, no change in fixation probability appears in the presence of motion. We further investigate how motion affects evolution on the square lattice and weighted graphs. In the case of weighted graphs, we find that motion can be either an amplifier or a suppressor of natural selection. In some cases, whether it is one or the other can be a function of the relative reproductive rate, indicating that motion is a subtle and complex attribute of evolving populations. As a first step towards understanding less restricted types of motion in evolutionary graph theory, we consider a similar rule on dynamic graphs induced by a spatial flow and find qualitatively similar results, indicating that continuous motion also suppresses natural selection.

Effects of motion in structured populations
Madison S. Krieger, Alex McAvoy, Martin A. Nowak
Published 4 October 2017. DOI: 10.1098/rsif.2017.0509
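
The death–birth rule with swapping is easy to simulate directly. The sketch below estimates the fixation probability of a single mutant of relative fitness r on a cycle, interleaving swap events with DB updates (the event scheduling and parameter choices are our simplifications, not the authors' exact protocol):

```python
import random

def fixation_prob(N=5, r=1.0, m=0.5, runs=2000, seed=0):
    """Estimate fixation probability of one mutant (fitness r) on a cycle
    under death-birth updating interleaved with random neighbour swaps."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(runs):
        pop = [0] * N                      # 0 = resident, 1 = mutant
        pop[rng.randrange(N)] = 1
        while 0 < sum(pop) < N:
            if rng.random() < m:           # motion: swap a random adjacent pair
                i = rng.randrange(N)
                j = (i + 1) % N
                pop[i], pop[j] = pop[j], pop[i]
            else:                          # DB update: die, then fill from a
                i = rng.randrange(N)       # neighbour chosen by fitness
                left, right = pop[(i - 1) % N], pop[(i + 1) % N]
                fl = r if left == 1 else 1.0
                fr = r if right == 1 else 1.0
                pop[i] = left if rng.random() < fl / (fl + fr) else right
        fixed += pop[0]
    return fixed / runs
```

For the neutral case r = 1 the estimate should sit near 1/N: swaps conserve the number of mutants, and neutral DB updating keeps that number a martingale.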


## The shape of collaborations

The structure of scientific collaborations has been the object of intense study, both for its importance for innovation and scientific advancement and, thanks to the availability of authorship data, as a model system for social group coordination and formation. Over recent years, complex-network approaches to this problem have yielded important insights and shaped our understanding of scientific communities. In this paper we propose to complement the picture provided by network tools with that coming from simplicial descriptions of publications and the corresponding topological methods. We show that it is natural to extend the concept of triadic closure to simplicial complexes and find the presence of strong simplicial closure. Focusing on the differences between scientific fields, we find that, while categories are characterized by different collaboration size distributions, the distribution of the number of collaborations in which an author participates is conserved across fields, pointing to underlying attentional and temporal constraints. We then show that homological cycles, which can intuitively be thought of as holes in the network fabric, are an important part of the underlying community linking structure.

The shape of collaborations
Alice Patania, Giovanni Petri and Francesco Vaccarino
EPJ Data Science 2017 6:18
https://doi.org/10.1140/epjds/s13688-017-0114-8


## Estimating local commuting patterns from geolocated Twitter data

The emergence of large stores of transactional data generated by increasing use of digital devices presents a huge opportunity for policymakers to improve their knowledge of the local environment and thus make more informed and better decisions. A research frontier is hence emerging which involves exploring the type of measures that can be drawn from data stores such as mobile phone logs, Internet searches and contributions to social media platforms, and the extent to which these measures are accurate reflections of the wider population. This paper contributes to this research frontier by exploring the extent to which local commuting patterns can be estimated from data drawn from Twitter. It makes three contributions in particular. First, it shows that heuristics applied to geolocated Twitter data offer a good proxy for local commuting patterns, one which outperforms the current best method for estimating these patterns (the radiation model). This finding is of particular significance because we make use of relatively coarse geolocation data (at the city level) and use simple heuristics based on frequency counts. Second, it investigates sources of error in the proxy measure, showing that the model performs better on short trips with higher volumes of commuters; it also looks at demographic biases but finds that, surprisingly, measurements are not significantly affected by the fact that the demographic makeup of Twitter users differs significantly from the population as a whole. Finally, it looks at potential ways of going beyond simple frequency heuristics by incorporating temporal information into models.

Estimating local commuting patterns from geolocated Twitter data
Graham McNeill, Jonathan Bright, and Scott A Hale

EPJ Data Science 2017 6:24

https://doi.org/10.1140/epjds/s13688-017-0120-x
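
The radiation model that serves as the benchmark here has a closed form with no adjustable parameters: with m_i and n_j the origin and destination populations, T_i the total number of commuters leaving i, and s_ij the population inside the circle of radius r_ij centred on i (excluding both endpoints), the expected flux is T_ij = T_i m_i n_j / ((m_i + s_ij)(m_i + n_j + s_ij)). A one-function sketch (variable names ours):

```python
def radiation_flux(T_i, m_i, n_j, s_ij):
    """Expected commuter flux i -> j under the radiation model.
    T_i: total commuters leaving origin i; m_i, n_j: origin and
    destination populations; s_ij: intervening population."""
    return T_i * m_i * n_j / ((m_i + s_ij) * (m_i + n_j + s_ij))
```

Note that intervening population, not distance itself, is what suppresses the flux, which is why the model needs no fitted parameters.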

 Suggested by Artem Kaznatcheev

## Complexity of evolutionary equilibria in static fitness landscapes

Experiments show that fitness landscapes can have a rich combinatorial structure due to epistasis and yet theory assumes that local peaks can be reached quickly. I introduce a distinction between easy landscapes where local fitness peaks can be found in a moderate number of steps and hard landscapes where finding evolutionary equilibria requires an infeasible amount of time. Hard examples exist even among landscapes with no reciprocal sign epistasis; on these, strong-selection weak-mutation dynamics cannot find the unique peak in polynomial time. On hard rugged fitness landscapes, no evolutionary dynamics -- even ones that do not follow adaptive paths -- can find a local fitness peak quickly; and the fitness advantage of nearby mutants cannot drop off exponentially fast but has to follow a power law that long-term evolution experiments have associated with unbounded growth in fitness. I present candidates for hard landscapes at scales from single genes, to microbes, to complex organisms with costly learning (Baldwin effect). Even though hard landscapes are static and finite, local evolutionary equilibrium cannot be assumed.
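
The dynamics at issue can be made concrete with a greedy adaptive walk, i.e. a fittest-mutant strong-selection weak-mutation process, on a random NK landscape. This is a standard stand-in for illustration; the hard instances the abstract refers to are specially constructed, not typical random landscapes:

```python
import random

def nk_landscape(N, K, seed=0):
    """Random NK fitness landscape: each locus' contribution depends on
    itself and the next K loci (cyclic neighbourhoods)."""
    rng = random.Random(seed)
    tables = [[rng.random() for _ in range(2 ** (K + 1))] for _ in range(N)]
    def fitness(genome):
        total = 0.0
        for i in range(N):
            idx = 0
            for d in range(K + 1):
                idx = (idx << 1) | genome[(i + d) % N]
            total += tables[i][idx]
        return total / N
    return fitness

def adaptive_walk(N, K, seed=0):
    """Fittest-mutant SSWM walk from a random genome; returns
    (number of uphill steps, local peak reached)."""
    rng = random.Random(seed + 1)
    f = nk_landscape(N, K, seed)
    g = [rng.randint(0, 1) for _ in range(N)]
    steps = 0
    while True:
        best, best_f = None, f(g)
        for i in range(N):         # scan all one-mutant neighbours
            g[i] ^= 1
            if f(g) > best_f:
                best, best_f = i, f(g)
            g[i] ^= 1
        if best is None:           # no improving neighbour: a local peak
            return steps, g
        g[best] ^= 1
        steps += 1
```

On easy landscapes such walks terminate after a moderate number of uphill steps; the paper's point is that this cannot be taken for granted in general.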


## Algorithmically probable mutations reproduce aspects of evolution such as convergence rate, genetic memory, modularity, diversity explosions, and mass extinction

We show that if evolution is algorithmic in any form and can thus be considered a program in software space, the emergence of a natural algorithmic probability distribution has the potential to become an accelerating mechanism. We simulate the application of algorithmic mutations to binary matrices based on numerical approximations to algorithmic probability, comparing the evolutionary speed to the alternative hypothesis of uniformly distributed mutations for a series of matrices of varying complexity. When the algorithmic mutation produces unfit organisms---because mutations may lead to, for example, syntactically useless evolutionary programs---massive extinctions may occur. We show that modularity provides an evolutionary advantage, and that the process also evolves a genetic memory. We demonstrate that such regular structures are preserved and carried on when they first occur and can also lead to an accelerated production of diversity and extinction, possibly explaining natural phenomena such as periods of accelerated growth of the number of species (e.g. the Cambrian explosion) and the occurrence of massive extinctions (e.g. the End Triassic) whose causes are a matter of considerable debate. The approach introduced here appears to be a better approximation to actual biological evolution than models based upon the application of mutation from uniform probability distributions, and because evolution by algorithmic probability converges faster to regular structures (both artificial and natural, as tested on a small biological network), it also approaches a formal version of open-ended evolution based on previous results. The results validate the motivations and results of Chaitin's Metabiology programme. We also show that the procedure has the potential to significantly accelerate solving optimization problems in the context of artificial evolutionary algorithms.

Algorithmically probable mutations reproduce aspects of evolution such as convergence rate, genetic memory, modularity, diversity explosions, and mass extinction

Santiago Hernández-Orozco, Hector Zenil, Narsis A. Kiani


## A physical model for efficient ranking in networks

We present a principled model and algorithm to infer a hierarchical ranking of nodes in directed networks. Unlike other methods such as minimum violation ranking, it assigns real-valued scores to nodes rather than simply ordinal ranks, and it formalizes the assumption that interactions are more likely to occur between individuals with similar ranks. It provides a natural framework for a statistical significance test for distinguishing when the inferred hierarchy is due to the network topology or is instead due to random chance, and it can be used to perform inference tasks such as predicting the existence or direction of edges. The ranking is inferred by solving a linear system of equations, which is sparse if the network is; thus the resulting algorithm is extremely efficient and scalable. We illustrate these findings by analyzing real and synthetic data and show that our method outperforms others, in both speed and accuracy, in recovering the underlying ranks and predicting edge directions.

A physical model for efficient ranking in networks
Caterina De Bacco, Daniel B. Larremore, Cristopher Moore
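
The linear system in question (the model is published as SpringRank) comes from minimizing the spring energy sum_ij A_ij (s_i - s_j - 1)^2, which yields [D_out + D_in - (A + A^T)] s = d_out - d_in. That system is singular up to a uniform shift of all scores, so a small ridge term alpha fixes the gauge. A dense toy solver for intuition (the real algorithm exploits sparsity and scales far better):

```python
def spring_rank(n, edges, alpha=1e-8):
    """Real-valued ranks from directed edges (i, j) meaning 'i above j'.
    Solves [D_out + D_in - (A + A^T) + alpha*I] s = d_out - d_in."""
    A = [[0.0] * n for _ in range(n)]
    for i, j in edges:
        A[i][j] += 1.0
    d_out = [sum(A[i]) for i in range(n)]
    d_in = [sum(A[i][j] for i in range(n)) for j in range(n)]
    M = [[(d_out[i] + d_in[i] + alpha if i == k else 0.0) - A[i][k] - A[k][i]
          for k in range(n)] for i in range(n)]
    b = [d_out[i] - d_in[i] for i in range(n)]
    # Gaussian elimination with partial pivoting (dense; illustration only)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n):
                M[r][k] -= f * M[col][k]
            b[r] -= f * b[col]
    s = [0.0] * n
    for r in range(n - 1, -1, -1):
        s[r] = (b[r] - sum(M[r][k] * s[k] for k in range(r + 1, n))) / M[r][r]
    return s
```

On a chain 0 -> 1 -> 2 the recovered scores are (approximately) 1, 0, -1: real-valued, unit-spaced, and ordered, not merely ordinal ranks.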


## Evidence of complex contagion of information in social media: An experiment using Twitter bots

It has recently become possible to study the dynamics of information diffusion in techno-social systems at scale, due to the emergence of online platforms, such as Twitter, with millions of users. One question that systematically recurs is whether information spreads according to simple or complex dynamics: does each exposure to a piece of information have an independent probability of a user adopting it (simple contagion), or does this probability depend instead on the number of sources of exposure, increasing above some threshold (complex contagion)? Most studies to date are observational and, therefore, unable to disentangle the effects of confounding factors such as social reinforcement, homophily, limited attention, or network community structure. Here we describe a novel controlled experiment that we performed on Twitter using ‘social bots’ deployed to carry out coordinated attempts at spreading information. We propose two Bayesian statistical models describing simple and complex contagion dynamics, and test the competing hypotheses. We provide experimental evidence that the complex contagion model describes the observed information diffusion behavior more accurately than simple contagion. Future applications of our results include more effective defenses against malicious propaganda campaigns on social media, improved marketing and advertisement strategies, and design of effective network intervention techniques.

Mønsted B, Sapieżyński P, Ferrara E, Lehmann S (2017) Evidence of complex contagion of information in social media: An experiment using Twitter bots. PLoS ONE 12(9): e0184148. https://doi.org/10.1371/journal.pone.0184148
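
The competing hypotheses differ in how the adoption probability depends on exposures. A schematic sketch with illustrative functional forms (not the paper's fitted Bayesian models):

```python
import random

def p_simple(k, p=0.1):
    """Simple contagion: each of k exposures is an independent adoption chance."""
    return 1 - (1 - p) ** k

def p_complex(k, theta=2, p_high=1.0):
    """Complex contagion (threshold flavour): adoption requires at least
    theta distinct exposing sources."""
    return p_high if k >= theta else 0.0

def spread_round(adopted, neighbors, rule):
    """One synchronous round: each non-adopter adopts with probability
    rule(k), where k counts its already-adopted neighbours."""
    new = set(adopted)
    for v, nbrs in neighbors.items():
        if v not in adopted and random.random() < rule(sum(n in adopted for n in nbrs)):
            new.add(v)
    return new
```

Under simple contagion two exposures merely double the odds; under complex contagion the second exposure is qualitatively different, which is the signature the bot experiment tests for.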


## Peer punishment promotes enforcement of bad social norms

Social norms are an important element in explaining how humans achieve very high levels of cooperative activity. It is widely observed that, when norms can be enforced by peer punishment, groups are able to resolve social dilemmas in prosocial, cooperative ways. Here we show that punishment can also encourage participation in destructive behaviours that are harmful to group welfare, and that this phenomenon is mediated by a social norm. In a variation of a public goods game, in which the return to investment is negative for both group and individual, we find that the opportunity to punish led to higher levels of contribution, thereby harming collective payoffs. A second experiment confirmed that, independently of whether punishment is available, a majority of subjects regard the efficient behaviour of non-contribution as socially inappropriate. The results show that simply providing a punishment opportunity does not guarantee that punishment will be used for socially beneficial ends, because the social norms that influence punishment behaviour may themselves be destructive.

Peer punishment promotes enforcement of bad social norms
Klaus Abbink, Lata Gangadharan, Toby Handfield & John Thrasher
Nature Communications 8, Article number: 609 (2017)
doi:10.1038/s41467-017-00731-0
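
The game's payoff structure can be sketched in a few lines: a standard public-goods pool, but with a multiplier below one, so every token contributed destroys value for the group and for the contributor. Numbers here are illustrative, not the experiment's parameters:

```python
def payoffs(contributions, endowment=20.0, multiplier=0.5):
    """Public-goods payoffs; multiplier < 1 makes contributing wasteful:
    each agent keeps endowment - own contribution + equal share of pool."""
    share = multiplier * sum(contributions) / len(contributions)
    return [endowment - c + share for c in contributions]
```

Contributing is strictly dominated and collectively wasteful in this setup, so any punishment regime that props up contributions is, by construction, enforcing a bad social norm.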


## The misleading narrative of the canonical faculty productivity trajectory

Scholarly productivity impacts nearly every aspect of a researcher’s career, from their initial placement as faculty to funding and tenure decisions. Historically, expectations for individuals rely on 60 years of research on aggregate trends, which suggest that productivity rises rapidly to an early-career peak and then gradually declines. Here we show, using comprehensive data on the publication and employment histories of an entire field of research, that the canonical narrative of “rapid rise, gradual decline” describes only about one-fifth of individual faculty, and the remaining four-fifths exhibit a rich diversity of productivity patterns. This suggests existing models and expectations for faculty productivity require revision, as they capture only one of many ways to have a successful career in science.

The misleading narrative of the canonical faculty productivity trajectory
Samuel F. Way, Allison C. Morgan, Aaron Clauset, and Daniel B. Larremore


## Mastering the game of Go without human knowledge

A long-standing goal of artificial intelligence is an algorithm that learns, tabula rasa, superhuman proficiency in challenging domains. Recently, AlphaGo became the first program to defeat a world champion in the game of Go. The tree search in AlphaGo evaluated positions and selected moves using deep neural networks. These neural networks were trained by supervised learning from human expert moves, and by reinforcement learning from self-play. Here we introduce an algorithm based solely on reinforcement learning, without human data, guidance or domain knowledge beyond game rules. AlphaGo becomes its own teacher: a neural network is trained to predict AlphaGo’s own move selections and also the winner of AlphaGo’s games. This neural network improves the strength of the tree search, resulting in higher quality move selection and stronger self-play in the next iteration. Starting tabula rasa, our new program AlphaGo Zero achieved superhuman performance, winning 100–0 against the previously published, champion-defeating AlphaGo.

Mastering the game of Go without human knowledge
David Silver, Julian Schrittwieser, Karen Simonyan, Ioannis Antonoglou, Aja Huang, Arthur Guez, Thomas Hubert, Lucas Baker, Matthew Lai, Adrian Bolton, Yutian Chen, Timothy Lillicrap, Fan Hui, Laurent Sifre, George van den Driessche, Thore Graepel & Demis Hassabis

Nature 550, 354–359 (19 October 2017) doi:10.1038/nature24270

Complexity Digest's insight:

It might be argued that, since AlphaGo learned from human knowledge and AlphaGo Zero learned from AlphaGo, AlphaGo Zero does require (indirect) human knowledge. Still, the results are impressive and relevant.


## Neutron-Star Collision Shakes Space-Time and Lights Up the Sky

In the days after Aug. 17, astronomers made successful observations of the colliding neutron stars with optical, radio, X-ray, gamma-ray, infrared and ultraviolet telescopes. The enormous collaborative effort, detailed today in dozens of papers appearing simultaneously in Physical Review Letters, Nature, Science, Astrophysical Journal Letters and other journals, has not only allowed astrophysicists to piece together a coherent account of the event, but also to answer longstanding questions in astrophysics.

## Mobility can promote the evolution of cooperation via emergent self-assortment dynamics

Cooperation among animals is ubiquitous. In a cooperative interaction, the cooperator confers a benefit to its partner at a personal cost. How does natural selection favour such a costly behaviour? Classical theories argue that cooperative interactions among genetic relatives, reciprocal cooperators, or among individuals within groups in viscous population structures are necessary to maintain cooperation. However, many organisms are mobile, and live in dynamic (fission-fusion) groups that constantly merge and split. In such populations, the above mechanisms may be inadequate to explain cooperation. Here, we develop a minimal model that explicitly accounts for mobility and cohesion among organisms. We find that mobility can support cooperation via emergent dynamic groups, even in the absence of previously known mechanisms. Our results may offer insights into the evolution of cooperation in animals that live in fission-fusion groups, such as birds, fish or mammals, or microbes living in turbulent media, such as in oceans or in the bloodstreams of animal hosts.

Joshi J, Couzin ID, Levin SA, Guttal V (2017) Mobility can promote the evolution of cooperation via emergent self-assortment dynamics. PLoS Comput Biol13(9): e1005732. https://doi.org/10.1371/journal.pcbi.1005732

Arjen ten Have's curator insight,
Cooperation is often explained by Hamilton's rule (rB > C: relatedness times benefit must exceed cost), but that may not hold once mobility is taken into account. Here a model is presented that deals with mobility. Although modeled for animals, this plausibly applies to human society as well.

## The Strength of Absent Ties: Social Integration via Online Dating

We used to marry people to whom we were somehow connected: friends of friends, schoolmates, neighbours. Since we were more connected to people similar to us, we were likely to marry someone from our own race.
However, online dating has changed this pattern: people who meet online tend to be complete strangers. Given that one-third of modern marriages start online, we investigate theoretically, using random graphs and matching theory, the effects of those previously absent ties in the diversity of modern societies.
We find that when a society benefits from previously absent ties, social integration occurs rapidly, even if the number of partners met online is small. Our findings are consistent with the sharp increase in interracial marriages in the U.S. in the last two decades.

The Strength of Absent Ties: Social Integration via Online Dating
Josue Ortega, Philipp Hergovich


## Where is technology taking the economy?

We are creating an intelligence that is external to humans and housed in the virtual economy. This is bringing us into a new economic era—a distributive one—where different rules apply.

Where is technology taking the economy?
By W. Brian Arthur

McKinsey Quarterly


## Data-driven modeling of collaboration networks: a cross-domain analysis

We analyze large-scale data sets about collaborations from two different domains: economics, specifically 22,000 R&D alliances between 14,500 firms, and science, specifically 300,000 co-authorship relations between 95,000 scientists. Considering the different domains of the data sets, we address two questions: (a) to what extent do the collaboration networks reconstructed from the data share common structural features, and (b) can their structure be reproduced by the same agent-based model. In our data-driven modeling approach we use aggregated network data to calibrate the probabilities at which agents establish collaborations with either newcomers or established agents. The model is then validated by its ability to reproduce network features not used for calibration, including distributions of degrees, path lengths, local clustering coefficients and sizes of disconnected components. Emphasis is put on comparing domains, but also sub-domains (economic sectors, scientific specializations). Interpreting the link probabilities as strategies for link formation, we find that in R&D collaborations newcomers prefer links with established agents, while in co-authorship relations newcomers prefer links with other newcomers. Our results shed new light on the long-standing question about the role of endogenous and exogenous factors (i.e., different information available to the initiator of a collaboration) in network formation.

Data-driven modeling of collaboration networks: a cross-domain analysis
Mario V Tomasello, Giacomo Vaccario and Frank Schweitzer
EPJ Data Science 2017 6:22
https://doi.org/10.1140/epjds/s13688-017-0117-5


## Rapid rise and decay in petition signing

Contemporary collective action, much of which involves social media and other Internet-based platforms, leaves a digital imprint which may be harvested to better understand the dynamics of mobilization. Petition signing is an example of collective action which has gained in popularity with rising use of social media and provides such data for the whole population of petition signatories for a given platform. This paper tracks the growth curves of all 20,000 petitions to the UK government petitions website (http://epetitions.direct.gov.uk) and 1,800 petitions to the US White House site (https://petitions.whitehouse.gov), analyzing the rate of growth and outreach mechanism. Previous research has suggested the importance of the first day to the ultimate success of a petition, but has not examined early growth within that day, made possible here through hourly resolution in the data. The analysis shows that the vast majority of petitions do not achieve any measure of success; over 99 percent fail to get the 10,000 signatures required for an official response and only 0.1 percent attain the 100,000 required for a parliamentary debate (0.7 percent in the US). We analyze the data through a multiplicative process model framework to explain the heterogeneous growth of signatures at the population level. We define and measure an average outreach factor for petitions and show that it decays very fast (reducing to 0.1% after 10 hours in the UK and 30 hours in the US). After a day or two, a petition’s fate is virtually set. The findings challenge conventional analyses of collective action from economics and political science, where the production function has been assumed to follow an S-shaped curve.

Rapid rise and decay in petition signing
Taha Yasseri, Scott A. Hale and Helen Z. Margetts
EPJ Data Science 2017, 6:20
https://doi.org/10.1140/epjds/s13688-017-0116-6

No comment yet.
 Scooped by Complexity Digest

## Reliable uncertainties in indirect measurements

In this article we present a very intuitive, easy-to-follow, yet mathematically rigorous approach to the so-called data-fitting process. Rather than minimizing the distance between measured and simulated data points, we prefer to find a region of the searched parameter space that generates a simulated curve crossing as many of the acquired experimental points as possible, but at least half of them. Such a task is easy to attack with interval calculations. The problem, however, is that interval calculations operate on guaranteed intervals, that is, on pairs of numbers determining the minimal and maximal values of a measured quantity, whereas in the vast majority of cases our measured quantities are expressed as a different pair of numbers: the average value and its standard deviation. Here we propose a combination of interval calculus with basic notions from probability and statistics. This approach makes it possible to obtain the results in the familiar form of reliable values of the searched parameters, their standard deviations, and their correlations as well. There are no assumptions concerning the probability density distributions of the experimental values besides the obvious one that their variances are finite. Neither symmetry of the uncertainties of the experimental distributions is required nor do those uncertainties have to be `small.' As a side effect, outliers are quietly and safely ignored, even if numerous.
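The acceptance criterion described in the abstract — keep a parameter set only if its simulated curve crosses at least half of the experimental intervals — can be sketched as below. The function names are hypothetical, and building each interval as mean ± one standard deviation (in the usage example) is just one possible choice, not necessarily the paper's.

```python
def crossings(model, params, xs, lo, hi):
    """Count how many measurement intervals [lo[i], hi[i]] the
    simulated curve model(x, params) passes through."""
    return sum(lo[i] <= model(xs[i], params) <= hi[i]
               for i in range(len(xs)))

def feasible(model, params, xs, lo, hi):
    """Keep a parameter set if its curve crosses at least half of the
    experimental intervals (the acceptance criterion in the abstract)."""
    return 2 * crossings(model, params, xs, lo, hi) >= len(xs)

# Usage: a linear model with intervals built as mean +/- one sigma.
line = lambda x, p: p[0] * x + p[1]
xs, ys, sigma = [0, 1, 2, 3], [1.0, 3.0, 5.0, 7.0], 0.5
lo = [y - sigma for y in ys]
hi = [y + sigma for y in ys]
# feasible(line, (2, 1), xs, lo, hi) -> True (crosses all four intervals)
# feasible(line, (0, 0), xs, lo, hi) -> False (crosses none)
```

In the full interval-arithmetic treatment one would sweep boxes of the parameter space rather than single points, but the counting criterion is the same.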

Reliable uncertainties in indirect measurements
Marek W. Gutowski

No comment yet.
 Scooped by Complexity Digest

## Reshaping Business With Artificial Intelligence

Expectations for artificial intelligence (AI) are sky-high, but what are businesses actually doing now? The goal of this report is to present a realistic baseline that allows companies to compare their AI ambitions and efforts. Building on data rather than conjecture, the research is based on a global survey of more than 3,000 executives, managers, and analysts across industries and in-depth interviews with more than 30 technology experts and executives.

No comment yet.
 Scooped by Complexity Digest

## Relatedness, Knowledge Diffusion, and the Evolution of Bilateral Trade

During the last decades two important contributions have reshaped our understanding of international trade. First, countries trade more with those with whom they share history, language, and culture, suggesting that trade is limited by information frictions. Second, countries are more likely to start exporting products that are similar to their current exports, suggesting that knowledge diffusion among related industries is a key constraint shaping the diversification of exports. But does knowledge about how to export to a destination also diffuse among related products and geographic neighbors? Do countries need to learn how to trade each product to each destination? Here, we use bilateral trade data from 2000 to 2015 to show that countries are more likely to increase their exports of a product to a destination when: (i) they export related products to it, (ii) they export the same product to a neighbor of the destination, (iii) they have neighbors who export the same product to that destination. Then, we explore the magnitude of these effects for new, nascent, and experienced exporters (exporters with and without comparative advantage in a product) and also for groups of products with different levels of technological sophistication. We find that the effects of product and geographic relatedness are stronger for new exporters, and also that the effect of product relatedness is stronger for more technologically sophisticated products. These findings support the idea that international trade is shaped by information frictions that are reduced in the presence of related products and experienced geographic neighbors.
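The three predictors (i)–(iii) can be made concrete with a small sketch. The container shapes below — export flows as (product, destination) pairs per country, a relatedness map per product, and a shared neighbor map — are illustrative assumptions, not the paper's data model.

```python
def relatedness_signals(country, product, dest,
                        exports, related, neighbors):
    """Evaluate the three predictors from the abstract for one
    (country, product, destination) triple. `exports` maps a country
    to its set of (product, destination) flows; `related` maps a
    product to its related products; `neighbors` maps a country (or
    destination) to its geographic neighbors. All hypothetical shapes."""
    flows = exports.get(country, set())
    return {
        # (i) the country exports related products to this destination
        "related_product": any((p, dest) in flows
                               for p in related.get(product, ())),
        # (ii) the country exports this product to a neighbor of the destination
        "neighbor_destination": any((product, d) in flows
                                    for d in neighbors.get(dest, ())),
        # (iii) a neighbor of the country exports this product to the destination
        "neighbor_exporter": any((product, dest) in exports.get(c, set())
                                 for c in neighbors.get(country, ())),
    }
```

In the paper these signals enter a regression on the probability of increasing exports; here they are reduced to booleans purely to fix the definitions.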

Relatedness, Knowledge Diffusion, and the Evolution of Bilateral Trade
Bogang Jun, Aamena Alshamsi, Jian Gao, Cesar A Hidalgo

No comment yet.
 Scooped by Complexity Digest

## Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure of the information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes' maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate, $H(p) = -\sum_{i} p_{i} \log p_{i}$. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes the three entropy concepts lead to different functional forms of entropy, which we will refer to as $S_{\mathrm{EXT}}$ for extensive entropy, $S_{\mathrm{IT}}$ for the source information rate in information theory, and $S_{\mathrm{MEP}}$ for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes; for sample-space-reducing (SSR) processes, which are simple history-dependent processes associated with power-law statistics; and finally for multinomial mixture processes.
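As a small illustration, the degenerate functional form $H(p) = -\sum_i p_i \log p_i$ and a minimal Pólya urn — the paper's first example of a self-reinforcing, nonmultinomial process — can be sketched as follows. The urn parameters (`colors`, `reinforcement`) are illustrative defaults, not values from the paper.

```python
import math
import random

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i: the functional form shared by the
    thermodynamic, information-theoretic, and maximum-entropy notions
    for multinomial/ergodic processes (natural log, nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def polya_urn(steps, colors=2, reinforcement=1, seed=1):
    """Minimal Pólya urn: draw a ball with probability proportional to
    its color's count, then add `reinforcement` extra balls of that
    color. Self-reinforcing and history-dependent, hence outside the
    multinomial setting where the three entropies coincide."""
    rng = random.Random(seed)
    counts = [1] * colors
    for _ in range(steps):
        i = rng.choices(range(colors), weights=counts)[0]
        counts[i] += reinforcement
    total = sum(counts)
    return [c / total for c in counts]
```

For an i.i.d. (multinomial) source the empirical frequencies converge to a fixed $p$ and `shannon_entropy` captures all three notions at once; for the urn, the limiting composition is itself random, which is exactly the regime where the paper shows $S_{\mathrm{EXT}}$, $S_{\mathrm{IT}}$, and $S_{\mathrm{MEP}}$ split apart.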
No comment yet.