Papers
Scooped by Complexity Digest onto Papers

Collective Phenomena and Non-Finite State Computation in a Human Social System

We investigate the computational structure of a paradigmatic example of distributed social interaction: that of the open-source Wikipedia community. We examine the statistical properties of its cooperative behavior, and perform model selection to determine whether this aspect of the system can be described by a finite-state process, or whether reference to an effectively unbounded resource allows for a more parsimonious description. We find strong evidence, in a majority of the most-edited pages, in favor of a collective-state model, where the probability of a “revert” action declines as the square root of the number of non-revert actions seen since the last revert. We provide evidence that the emergence of this social counter is driven by collective interaction effects, rather than properties of individual users.

 

DeDeo S (2013) Collective Phenomena and Non-Finite State Computation in a Human Social System. PLoS ONE 8(10): e75818. http://dx.doi.org/10.1371/journal.pone.0075818
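
A minimal sketch of the reported scaling (my own toy illustration, not DeDeo's model-selection procedure): simulate an edit stream in which the revert probability decays as the inverse square root of the number of non-revert edits seen since the last revert, then recover the exponent from the simulated data. The constant c and all parameter values are illustrative assumptions.

```python
# Toy simulation (assumed functional form, not the paper's estimator): revert
# probability p(n) = c / sqrt(n + 1), with n the number of non-revert edits
# since the last revert; we then recover the exponent from the simulated stream.
import numpy as np

rng = np.random.default_rng(0)

def simulate(T=200_000, c=0.3):
    """Return a boolean edit stream: True = revert."""
    events = np.empty(T, dtype=bool)
    n = 0
    for t in range(T):
        is_revert = rng.random() < min(1.0, c / np.sqrt(n + 1))
        events[t] = is_revert
        n = 0 if is_revert else n + 1
    return events

def empirical_revert_prob(events, max_n=30):
    """Estimate P(revert | n non-reverts seen since the last revert)."""
    visits, reverts = np.zeros(max_n), np.zeros(max_n)
    n = 0
    for is_revert in events:
        if n < max_n:
            visits[n] += 1
            reverts[n] += is_revert
        n = 0 if is_revert else n + 1
    return reverts / np.maximum(visits, 1)

p_hat = empirical_revert_prob(simulate())
ns = np.arange(len(p_hat)) + 1              # p(n) was defined against n + 1
mask = p_hat > 0
slope, _ = np.polyfit(np.log(ns[mask]), np.log(p_hat[mask]), 1)
print(f"fitted exponent: {slope:.2f} (the collective-counter model implies -0.5)")
```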

Papers
Recent publications related to complex systems
Scooped by Complexity Digest

Synchronization as Aggregation: Cluster Kinetics of Pulse-Coupled Oscillators

We consider models of identical pulse-coupled oscillators with global interactions. Previous work showed that under certain conditions such systems always end up in sync, but did not quantify how small clusters of synchronized oscillators progressively coalesce into larger ones. Using tools from the study of aggregation phenomena, we obtain exact results for the time-dependent distribution of cluster sizes as the system evolves from disorder to synchrony.


Synchronization as Aggregation: Cluster Kinetics of Pulse-Coupled Oscillators
Kevin P. O'Keeffe, Pavel L. Krapivsky, Steven H. Strogatz

http://arxiv.org/abs/1501.04115
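
A minimal sketch (a Mirollo–Strogatz-style leaky integrate-and-fire model with all-to-all pulse coupling, used here as a standard stand-in rather than taken from the paper): oscillators pushed over threshold by a pulse are absorbed into the firing cluster, and the number of distinct clusters shrinks toward one as the system moves from disorder to synchrony. Parameter values are arbitrary.

```python
# Event-driven pulse-coupled oscillators (leaky integrate-and-fire voltages,
# all-to-all coupling). Oscillators pushed past threshold by a pulse are
# absorbed into the firing cluster; we record the number of clusters over time.
import numpy as np

rng = np.random.default_rng(1)

def simulate(N=100, eps=0.02, S=2.0, gamma=1.0, max_events=20_000):
    x = rng.random(N) * 0.99          # initial voltages, firing threshold at 1
    xinf = S / gamma                  # asymptotic voltage (> 1, so everyone fires)
    n_clusters = []
    for _ in range(max_events):
        # advance time until the most advanced oscillator hits the threshold
        dt = np.log((xinf - x.max()) / (xinf - 1.0)) / gamma
        x = xinf - (xinf - x) * np.exp(-gamma * dt)
        firing = x >= 1.0 - 1e-9
        x[~firing] += eps             # pulse from the firing cluster
        x[firing | (x >= 1.0)] = 0.0  # absorbed oscillators reset with the cluster
        n_clusters.append(len(np.unique(np.round(x, 9))))
        if n_clusters[-1] == 1:
            break
    return n_clusters

clusters = simulate()
print("clusters after successive firing events:", clusters[:8], "...", clusters[-1])
```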

Scooped by Complexity Digest

Expectations of brilliance underlie gender distributions across academic disciplines

The gender imbalance in STEM subjects dominates current debates about women’s underrepresentation in academia. However, women are well represented at the Ph.D. level in some sciences and poorly represented in some humanities (e.g., in 2011, 54% of U.S. Ph.D.’s in molecular biology were women versus only 31% in philosophy). We hypothesize that, across the academic spectrum, women are underrepresented in fields whose practitioners believe that raw, innate talent is the main requirement for success, because women are stereotyped as not possessing such talent. This hypothesis extends to African Americans’ underrepresentation as well, as this group is subject to similar stereotypes. Results from a nationwide survey of academics support our hypothesis (termed the field-specific ability beliefs hypothesis) over three competing hypotheses.


Expectations of brilliance underlie gender distributions across academic disciplines
Sarah-Jane Leslie, Andrei Cimpian, Meredith Meyer, Edward Freeland

Science 16 January 2015:
Vol. 347 no. 6219 pp. 262-265
http://dx.doi.org/10.1126/science.1261375 

Scooped by Complexity Digest

Degeneracy: Demystifying and destigmatizing a core concept in systems biology

Often relegated to the methods section of genetic research articles, the term “degeneracy” is regularly misunderstood and its theoretical significance widely understated. Degeneracy describes the ability of different structures to be conditionally interchangeable in their contribution to system functions. Frequently mislabeled redundancy, degeneracy refers to structural variation whereas redundancy refers to structural duplication. Sources of degeneracy include, but are not limited to, (1) duplicate structures that differentiate yet remain isofunctional, (2) unrelated isofunctional structures that are dispersed endogenously or exogenously, (3) variable arrangements of interacting structures that achieve the same output through multiple pathways, and (4) parcellation of a structure into subunits that can still variably perform the same initial function. The ability to perform the same function by drawing upon an array of dissimilar structures contributes advantageously to the integrity of a system. Drawing attention to the heterogeneous construction of living systems by highlighting the concept of degeneracy valuably enhances the ways scientists think about self-organization, robustness, and complexity. Labels in science, however, can sometimes be misleading. In scientific nomenclature, the word “degeneracy” has calamitous proximity to the word “degeneration” used by pathologists and the shunned theory of degeneration once promoted by eugenicists. This article disentangles the concept of degeneracy from its close etymological siblings and offers a brief overview of the historical and contemporary understandings of degeneracy in science. Distinguishing the importance of degeneracy will hopefully allow systems theorists to more strategically operationally conceptualize the distributed intersecting networks that comprise complex living systems.


Degeneracy: Demystifying and destigmatizing a core concept in systems biology
Paul H. Mason

Complexity
Volume 20, Issue 3, pages 12–21, January/February 2015

http://dx.doi.org/10.1002/cplx.21534 

Scooped by Complexity Digest

On optimal modularity for system construction

Modularity is a natural instrument and a ubiquitous practice for the engineering of human-made systems. However, modularization remains more of an art than a science, to the extent that the notion of optimal modularity is rarely used in engineering design. We prove that optimal modularity exists (at least for construction) and is achieved through balanced modularization as structural symmetry in the distribution of the sizes of modules. We show that system construction cost is highly sensitive to both the number of modules and the modularization structure. However, this sensitivity has an inverse relationship with process capability and is minimal for highly capable construction processes with small process uncertainties. Conclusions are reached by a Bayesian estimation technique for a relatively simple construction model originally introduced by Herbert Simon for the hypothetical production of a linear structure, taking into account errors that may occur in the work associated with the production of the links between the nodes in the structure for varied numbers of modules.


On optimal modularity for system construction
Mahmoud Efatmaneshnik and Michael J. Ryan
http://dx.doi.org/10.1002/cplx.21646

Complexity, Early View
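
A minimal sketch (my own toy rendering of the Simon-style construction argument the abstract alludes to, not the authors' Bayesian estimator): if each link fails with probability p and a failure destroys the module under construction, the expected cost of building L links in k modules, plus a final assembly stage, is minimized at an intermediate number of modules. All numbers are illustrative assumptions.

```python
# Toy expected-cost calculation (assumed one-level modular construction with
# link error probability p; not the authors' model or estimator).

def expected_links(m, p):
    """Expected link operations to complete m consecutive error-free links,
    when any error destroys the partial module."""
    q = 1.0 - p
    return (q ** -m - 1.0) / p

def construction_cost(L, k, p):
    """k modules of L/k links each, then k - 1 inter-module links to join them."""
    return k * expected_links(L / k, p) + expected_links(k - 1, p)

L, p = 128, 0.05
for k in (1, 2, 4, 8, 16, 32, 64, 128):
    print(f"{k:4d} modules -> expected cost {construction_cost(L, k, p):10.1f}")
best = min(range(1, L + 1), key=lambda k: construction_cost(L, k, p))
print("cost is minimized at an intermediate number of modules:", best)
```

Both extremes are expensive here: one monolithic module must survive every error, while maximal fragmentation pushes all the risk into the final assembly stage.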

Saberes Sin Fronteras Ong's curator insight, January 23, 5:44 PM

#modularization #system

Scooped by Complexity Digest

Where Next for Microbiome Research?

The development of high-throughput sequencing technologies has transformed our capacity to investigate the composition and dynamics of the microbial communities that populate diverse habitats. Over the past decade, these advances have yielded an avalanche of metagenomic data. The current stage of “van Leeuwenhoek”–like cataloguing, as well as functional analyses, will likely accelerate as DNA and RNA sequencing, plus protein and metabolic profiling capacities and computational tools, continue to improve. However, it is time to consider: what’s next for microbiome research? The short pieces included here briefly consider the challenges and opportunities awaiting microbiome research.


Waldor MK, Tyson G, Borenstein E, Ochman H, Moeller A, et al. (2015) Where Next for Microbiome Research? PLoS Biol 13(1): e1002050. http://dx.doi.org/10.1371/journal.pbio.1002050

Scooped by Complexity Digest

NETWORKED MINDS: Where human evolution is heading

Having studied the technological and social forces shaping our societies, we are now turning to the evolutionary forces. Among the millions of species on Earth, humans are truly unique.
What is the recipe for our success? What makes us special? How do we decide? How will we further evolve? What will our role be when algorithms, computers, machines, and robots are getting ever more powerful? How will our societies change?


http://futurict.blogspot.ie/2014/12/networked-minds-where-human-evolution.html

Scooped by Complexity Digest

Estimating Food Consumption and Poverty Indices with Mobile Phone Data

Recent studies have shown the value of mobile phone data to tackle problems related to economic development and humanitarian action. In this research, we assess the suitability of indicators derived from mobile phone data as a proxy for food security indicators. We compare the measures extracted from call detail records and airtime credit purchases to the results of a nationwide household survey conducted at the same time. Results show high correlations (>0.8) between mobile-phone-data-derived indicators and several relevant food security variables such as expenditure on food or vegetable consumption. This correspondence suggests that, in the future, proxies derived from mobile phone data could be used to provide valuable up-to-date operational information on food security throughout low- and middle-income countries.


Estimating Food Consumption and Poverty Indices with Mobile Phone Data
Adeline Decuyper, Alex Rutherford, Amit Wadhwa, Jean-Martin Bauer, Gautier Krings, Thoralf Gutierrez, Vincent D. Blondel, Miguel A. Luengo-Oroz

http://arxiv.org/abs/1412.2595
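
A minimal sketch (hypothetical district-level numbers and column names, not the study's data or pipeline): once indicators from call detail records and airtime purchases are aggregated to the same spatial units as the household survey, the comparison described above reduces to a correlation matrix.

```python
# Hypothetical aggregated data, one row per district (illustration only).
import pandas as pd

cdr = pd.DataFrame({
    "district": ["A", "B", "C", "D", "E"],
    "avg_airtime_purchase": [1.2, 0.8, 2.1, 1.5, 0.6],
    "calls_per_capita": [14.0, 9.5, 22.3, 16.8, 7.1],
})
survey = pd.DataFrame({
    "district": ["A", "B", "C", "D", "E"],
    "food_expenditure_share": [0.52, 0.61, 0.38, 0.47, 0.66],
    "vegetable_consumption": [3.1, 2.4, 4.8, 3.6, 2.0],
})

merged = cdr.merge(survey, on="district")
# Pairwise Pearson correlations between phone-derived and survey indicators.
print(merged.drop(columns="district").corr().round(2))
```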

Scooped by Complexity Digest

SOCIAL FORCES: Revealing the causes of success or disaster


We have seen that self-organizing systems can be very effective and efficient, but their macro-level behavior crucially depends on the interaction rules, interaction strength, and institutional settings. To get things right, it's important to understand the factors that drive the dynamics of the system. 


http://futurict.blogspot.ie/2014/12/social-forces-revealing-causes-of.html

Suggested by Matteo Chinazzi

Computational fact checking from knowledge networks

Traditional fact checking by expert journalists cannot keep up with the enormous volume of information that is now generated online. Computational fact checking may significantly enhance our ability to evaluate the veracity of dubious information. Here we show that the complexities of human fact checking can be approximated quite well by finding the shortest path between concept nodes under properly defined semantic proximity metrics on knowledge graphs. Framed as a network problem, this approach is feasible with efficient computational techniques. We evaluate this approach by examining tens of thousands of claims related to history, entertainment, geography, and biographical information using a public knowledge graph extracted from Wikipedia. Statements independently known to be true consistently receive higher support via our method than do false ones. These findings represent a significant step toward scalable computational fact-checking methods that may one day mitigate the spread of harmful misinformation.


Computational fact checking from knowledge networks
Giovanni Luca Ciampaglia, Prashant Shiralkar, Luis M. Rocha, Johan Bollen, Filippo Menczer, Alessandro Flammini

http://arxiv.org/abs/1501.03471
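
A minimal sketch in the spirit of this approach (a toy knowledge graph and a degree-penalized shortest path; the paper's exact semantic proximity metric may differ): claims whose subject and object are connected only through generic hub nodes receive lower support.

```python
# Degree-penalized shortest-path support score on a toy knowledge graph.
import math
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("Barack Obama", "Honolulu"), ("Honolulu", "Hawaii"),
    ("Barack Obama", "United States"), ("Hawaii", "United States"),
    ("Canberra", "Australia"), ("Australia", "country"),
    ("United States", "country"),
])

def hub_penalty(u, v, _attrs):
    # cost of stepping into v grows with its degree, so paths that run through
    # generic hubs (e.g. "country") support a claim less strongly
    return math.log(1 + G.degree(v))

def support(subject, obj):
    try:
        cost = nx.shortest_path_length(G, subject, obj, weight=hub_penalty)
    except nx.NetworkXNoPath:
        return 0.0
    return 1.0 / (1.0 + cost)

print(round(support("Barack Obama", "Hawaii"), 3))      # better supported
print(round(support("Barack Obama", "Australia"), 3))   # weaker, runs through hubs
```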

Scooped by Complexity Digest

The Fundamental Scale of Descriptions

The complexity of a system description is a function of the entropy of its symbolic description. Prior to computing the entropy of the system description, an observation scale has to be assumed. In natural language texts, typical scales are binary, characters, and words. However, considering languages as structures built around a certain preconceived set of symbols, like words or characters, is only a presumption. This study depicts the notion of the Description Fundamental Scale as a set of symbols that serves to analyze the essence of a language's structure. The concept of Fundamental Scale is tested using English and MIDI music texts by means of an algorithm developed to search for the set of symbols that minimizes the observed entropy of the system and therefore best expresses the fundamental scale of the language employed. Test results show that it is possible to find the Fundamental Scale of some languages. The concept of Fundamental Scale, and the method for its determination, emerge as an interesting tool to facilitate the study of languages and complex systems.


The Fundamental Scale of Descriptions
Gerardo Febres

http://arxiv.org/abs/1412.8268
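
A minimal sketch (not the author's search algorithm): the measured Shannon entropy of the same text depends on the observation scale, i.e. on what we treat as a symbol. Here the comparison is only between characters and words; the paper's algorithm searches over candidate symbol sets.

```python
# Entropy of one text at two observation scales (characters vs. words).
import math
from collections import Counter

def entropy_bits(symbols):
    """Shannon entropy per symbol, in bits."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

text = ("the fundamental scale of a description is the set of symbols "
        "that best reveals the structure of the language that produced it")

char_scale = list(text)          # character-level description
word_scale = text.split()        # word-level description

print(f"characters: {entropy_bits(char_scale):.2f} bits/symbol over {len(set(char_scale))} symbols")
print(f"words:      {entropy_bits(word_scale):.2f} bits/symbol over {len(set(word_scale))} symbols")
```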

Scooped by Complexity Digest

Follow the Leader: Herding Behavior in Heterogeneous Populations

Here we study the emergence of spontaneous leadership in large populations. In standard models of opinion dynamics, herding behavior is only obeyed at the local scale due to the interaction of single agents with their neighbors; while at the global scale, such models are governed by purely diffusive processes. Surprisingly, in this paper we show that the combination of a strong separation of time scales within the population and a hierarchical organization of the influences of some agents on the others induces a phase transition between a purely diffusive phase, as in the standard case, and a herding phase where a fraction of the agents self-organize and lead the global opinion of the whole population.


Follow the Leader: Herding Behavior in Heterogeneous Populations
Guillem Mosquera-Doñate, Marián Boguñá

http://arxiv.org/abs/1412.7427
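
A toy illustration (not the model analyzed in the paper): a voter-style dynamic in which a handful of highly influential agents effectively never change opinion on the timescale of the simulation, an extreme form of the time-scale separation described above. The bulk of the population ends up tracking the leaders instead of diffusing freely. All parameters are assumptions.

```python
# Toy voter-style dynamic with a few influential, effectively frozen "leaders".
import numpy as np

rng = np.random.default_rng(2)

N, n_leaders = 1000, 5
opinions = rng.choice([-1, 1], size=N)
opinions[:n_leaders] = 1                      # the leaders happen to agree

# probability of being copied: leaders are far more influential than followers
influence = np.ones(N)
influence[:n_leaders] = 200.0
influence /= influence.sum()

for _ in range(50_000):
    i = rng.integers(n_leaders, N)            # only followers update (extreme time-scale separation)
    j = rng.choice(N, p=influence)            # choose whom to copy, biased toward leaders
    opinions[i] = opinions[j]

print("mean follower opinion:", opinions[n_leaders:].mean())   # ends up close to the leaders' +1
print("mean leader opinion:  ", opinions[:n_leaders].mean())
```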

Scooped by Complexity Digest

Ricci Curvature of the Internet Topology

Analysis of Internet topologies has shown that the Internet topology has negative curvature, measured by Gromov's "thin triangle condition", which is tightly related to core congestion and route reliability. In this work we analyze the discrete Ricci curvature of the Internet, as defined by Ollivier, Lin, and others. Ricci curvature measures whether local distances diverge or converge. It is a more local measure, which allows us to understand the distribution of curvatures in the network. We show by various Internet data sets that the distribution of Ricci curvature is spread out, suggesting the network topology to be non-homogeneous. We also show that the Ricci curvature has interesting connections to local measures such as node degree and clustering coefficient, to global measures such as betweenness centrality and network connectivity, and to auxiliary attributes such as geographical distances. These observations add to the richness of geometric structures in complex network theory.


Ricci Curvature of the Internet Topology
Chien-Chun Ni, Yu-Yao Lin, Jie Gao, Xianfeng David Gu, Emil Saucan

http://arxiv.org/abs/1501.04138
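
A minimal sketch of the standard Ollivier–Ricci construction on graphs (not the authors' pipeline, and using a small built-in graph as a stand-in for an Internet topology snapshot): the curvature of an edge (x, y) is 1 − W1(m_x, m_y), where m_x is the uniform distribution on the neighbors of x and W1 is the Wasserstein-1 distance under shortest-path ground distances.

```python
# Ollivier-Ricci curvature of a single edge via a small transport LP.
import numpy as np
import networkx as nx
from scipy.optimize import linprog

def ollivier_ricci(G, x, y):
    src, dst = list(G.neighbors(x)), list(G.neighbors(y))
    mx = np.full(len(src), 1.0 / len(src))    # uniform measure on N(x)
    my = np.full(len(dst), 1.0 / len(dst))    # uniform measure on N(y)
    dist = dict(nx.all_pairs_shortest_path_length(G))
    cost = np.array([[dist[u][v] for v in dst] for u in src], dtype=float)

    # transport LP: minimize sum_{u,v} pi[u,v] * d(u,v) with both marginals fixed
    n, m = cost.shape
    A_eq, b_eq = [], []
    for i in range(n):                        # row marginals = mx
        row = np.zeros(n * m); row[i * m:(i + 1) * m] = 1.0
        A_eq.append(row); b_eq.append(mx[i])
    for j in range(m):                        # column marginals = my
        col = np.zeros(n * m); col[j::m] = 1.0
        A_eq.append(col); b_eq.append(my[j])
    res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    return 1.0 - res.fun / dist[x][y]

G = nx.karate_club_graph()                    # stand-in for an Internet topology
u, v = 0, 1
print(f"kappa({u},{v}) = {ollivier_ricci(G, u, v):.3f}")
```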

Scooped by Complexity Digest

The Big World of Nanothermodynamics

Nanothermodynamics extends standard thermodynamics to facilitate finite-size effects on the scale of nanometers. A key ingredient is Hill’s subdivision potential that accommodates the non-extensive energy of independent small systems, similar to how Gibbs’ chemical potential accommodates distinct particles. Nanothermodynamics is essential for characterizing the thermal equilibrium distribution of independently relaxing regions inside bulk samples, as is found for the primary response of most materials using various experimental techniques. The subdivision potential ensures strict adherence to the laws of thermodynamics: total energy is conserved by including an instantaneous contribution from the entropy of local configurations, and total entropy remains maximized by coupling to a thermal bath. A unique feature of nanothermodynamics is the completely-open nanocanonical ensemble. Another feature is that particles within each region become statistically indistinguishable, which avoids non-extensive entropy, and mimics quantum-mechanical behavior. Applied to mean-field theory, nanothermodynamics gives a heterogeneous distribution of regions that yields stretched-exponential relaxation and super-Arrhenius activation. Applied to Monte Carlo simulations, there is a nonlinear correction to Boltzmann’s factor that improves agreement between the Ising model and measured non-classical critical scaling in magnetic materials. Nanothermodynamics also provides a fundamental mechanism for the 1/f noise found in many materials.


The Big World of Nanothermodynamics
Ralph V. Chamberlin

Entropy 2015, 17(1), 52-73; http://dx.doi.org/10.3390/e17010052
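
For orientation, the small-system relations of Hill's theory referred to above can be written as follows (a standard statement of the theory, given here as background rather than taken from this paper): for an ensemble of \eta independent small systems,

\[
dE_t = T\,dS_t - p\,dV_t + \mu\,dN_t + \mathcal{E}\,d\eta,
\qquad
E = TS - pV + \mu N + \mathcal{E},
\]

where the subscript t denotes ensemble totals, E is the mean energy per small system, and \mathcal{E} is Hill's subdivision potential; \mathcal{E} \to 0 recovers ordinary extensive thermodynamics.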

Scooped by Complexity Digest

Cognition and behavior

An important question in the debate over embodied, enactive, and extended cognition has been what has been meant by “cognition”. What is this cognition that is supposed to be embodied, enactive, or extended? Rather than undertake a frontal assault on this question, however, this paper will take a different approach. In particular, we may ask how cognition is supposed to be related to behavior. First, we could ask whether cognition is supposed to be (a type of) behavior. Second, we could ask whether we should attempt to understand cognitive processes in terms of antecedently understood cognitive behaviors. This paper will survey some of the answers that have been (implicitly or explicitly) given in the embodied, enactive, and extended cognition literature, then suggest reasons to believe that we should answer both questions in the negative.


Cognition and behavior
Ken Aizawa

Synthese
January 2015

http://dx.doi.org/10.1007/s11229-014-0645-5

Scooped by Complexity Digest

All in the (bigger) family

The next time you are about to dig into a freshly steamed lobster for dinner, think “cockroach,” or better yet, “dragonfly.” A decade of genetic data and other evidence has persuaded most researchers that insects and crustaceans, long considered widely separated branches of the arthropod family tree, actually belong together. Now they are exploring the consequences of the revision, which traces insect ancestry to certain crustaceans. “When I think about traits in insects, I now have a context for where they came from,” says Jon Harrison, an evolutionary physiologist at Arizona State University, Tempe, who has spent 25 years investigating insect respiration. “It's a total change.”


All in the (bigger) family
Elizabeth Pennisi

Science 16 January 2015:
Vol. 347 no. 6219 pp. 220-221
http://dx.doi.org/10.1126/science.347.6219.220 

Complexity Digest's insight:

Let us hope that this revision promotes insectivorism: http://go.ted.com/suG 

Scooped by Complexity Digest

Development, information and social connectivity in Côte d'Ivoire

Understanding human socioeconomic development has proven to be one of the most difficult and persistent problems in science and policy. Traditional policy has often attempted to promote human development through infrastructure and the delivery of services, but the link between these engineered systems and the complexity of human socioeconomic behavior remains poorly understood. Recent research suggests that the key to socioeconomic progress lies in the development of processes whereby new information is created by individuals and organizations and embedded in the structure of social networks at a diverse set of scales, from nations to cities to firms. Here, we formalize these ideas in terms of network theory, namely the spatial network of mobile phone communications in Côte d’Ivoire, to show how incipient socioeconomic connectivity may constitute a general obstacle to development. Inspired by recent progress in the theory of cities as complex systems, we then propose a set of tests for these theories using telecommunications network data and describe how telecommunication services may generally help promote socioeconomic development.


Development, information and social connectivity in Côte d’Ivoire
Clio Andris and Luis MA Bettencourt

Infrastructure Complexity 2014, 1:1  http://dx.doi.org/10.1186/s40551-014-0001-4

Scooped by Complexity Digest

Human-Data Interaction: The Human Face of the Data-Driven Society

The increasing generation and collection of personal data has created a complex ecosystem, often collaborative but sometimes combative, around companies and individuals engaging in the use of these data. We propose that the interactions between these agents warrant a new topic of study: Human-Data Interaction (HDI). In this paper we discuss how HDI sits at the intersection of various disciplines, including computer science, statistics, sociology, psychology and behavioural economics. We expose the challenges that HDI raises, organised into three core themes of legibility, agency and negotiability, and we present the HDI agenda to open up a dialogue amongst interested parties in the personal and big data ecosystems.


Human-Data Interaction: The Human Face of the Data-Driven Society
Richard Mortier, Hamed Haddadi, Tristan Henderson, Derek McAuley, Jon Crowcroft

http://arxiv.org/abs/1412.6159

Scooped by Complexity Digest

A Unifying Theory for Scaling Laws of Human Populations

The spatial distribution of people exhibits clustering across a wide range of scales, from household to continental scales. Empirical data indicates simple power-law scalings for the size distribution of cities (known as Zipf's law), the geographic distribution of friends, and the population density fluctuations as a function of scale. We derive a simple statistical model that explains all of these scaling laws based on a single unifying principle involving the random spatial growth of clusters of people on all scales. The model makes important new predictions for the spread of diseases and other social phenomena.


A Unifying Theory for Scaling Laws of Human Populations
Henry W. Lin, Abraham Loeb

http://arxiv.org/abs/1501.00738
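
For orientation, the city-size scaling mentioned above (Zipf's law, stated here as standard background rather than quoted from the paper) reads

\[
P(S > s) \propto s^{-1}
\qquad\Longleftrightarrow\qquad
s(r) \propto r^{-1},
\]

where S is a city's population and s(r) is the size of the r-th largest city; the paper's claim is that this and the other scalings listed above follow from one random spatial growth principle.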

Scooped by Complexity Digest

Why It’s Good To Be Wrong


That human beings can be mistaken in anything they think or do is a proposition known as fallibilism. Stated abstractly like that, it is seldom contradicted. Yet few people have ever seriously believed it, either.

That our senses often fail us is a truism; and our self-critical culture has long ago made us familiar with the fact that we can make mistakes of reasoning too. But the type of fallibility that I want to discuss here would be all-pervasive even if our senses were as sharp as the Hubble Telescope and our minds were as logical as a computer. It arises from the way in which our ideas about reality connect with reality itself—how, in other words, we can create knowledge, and how we can fail to.

Suggested by Matteo Chinazzi

The multilayer temporal network of public transport in Great Britain


Despite the widespread availability of information concerning public transport coming from different sources, it is extremely hard to have a complete picture, in particular at a national scale. Here, we integrate timetable data obtained from the United Kingdom open-data program together with timetables of domestic flights, and obtain a comprehensive snapshot of the temporal characteristics of the whole UK public transport system for a week in October 2010. In order to focus on multi-modal aspects of the system, we use a coarse graining procedure and define explicitly the coupling between different transport modes such as connections at airports, ferry docks, rail, metro, coach and bus stations. The resulting weighted, directed, temporal and multilayer network is provided in simple, commonly used formats, ensuring easy access and the possibility of a straightforward use of old or specifically developed methods on this new and extensive dataset.


The multilayer temporal network of public transport in Great Britain
Riccardo Gallotti & Marc Barthelemy

Scientific Data, published online 6 January 2015. http://dx.doi.org/10.1038/sdata.2014.56
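
A minimal sketch (a hypothetical in-memory schema, not the published file layout): the weighted, directed, temporal, multilayer edge list described above can be held as a flat list of timed events tagged by transport mode, with multimodal transfers arising wherever layers share a stop identifier.

```python
# Hypothetical schema (illustration only, not the dataset's actual columns).
from dataclasses import dataclass

@dataclass
class TransportEvent:
    origin: str          # stop / station / airport identifier
    destination: str
    layer: str           # "bus", "rail", "metro", "coach", "ferry" or "air"
    departure_min: int   # departure time, in minutes from the start of the week
    duration_min: int    # travel time; serves as the edge weight

events = [
    TransportEvent("A", "B", "bus", 480, 12),
    TransportEvent("B", "C", "rail", 500, 34),   # transfer at "B" couples the layers
    TransportEvent("C", "D", "air", 620, 75),
]

# e.g. all rail departures before 09:00 on the first day
early_rail = [e for e in events if e.layer == "rail" and e.departure_min < 540]
print(early_rail)
```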

Scooped by Complexity Digest

A Rosetta Stone for Nature’s Benefits to People

After a long incubation period, the Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES) is now underway. Underpinning all its activities is the IPBES Conceptual Framework (CF), a simplified model of the interactions between nature and people. Drawing on the legacy of previous large-scale environmental assessments, the CF goes further in explicitly embracing different disciplines and knowledge systems (including indigenous and local knowledge) in the co-construction of assessments of the state of the world’s biodiversity and the benefits it provides to humans. The CF can be thought of as a kind of “Rosetta Stone” that highlights commonalities between diverse value sets and seeks to facilitate cross-disciplinary and cross-cultural understanding. We argue that the CF will contribute to the increasing trend towards interdisciplinarity in understanding and managing the environment. Rather than displacing disciplinary science, however, we believe that the CF will provide new contexts of discovery and policy applications for it.


Díaz S, Demissew S, Joly C, Lonsdale WM, Larigauderie A (2015) A Rosetta Stone for Nature’s Benefits to People. PLoS Biol 13(1): e1002040. http://dx.doi.org/10.1371/journal.pbio.1002040


Complexity Digest's insight:

See Also http://www.ipbes.net/

Scooped by Complexity Digest

Characterizing the Google Books corpus: Strong limits to inferences of socio-cultural and linguistic evolution

It is tempting to treat frequency trends from Google Books data sets as indicators for the true popularity of various words and phrases. Doing so allows us to draw novel conclusions about the evolution of public perception of a given topic, such as time and gender. However, sampling published works by availability and ease of digitization leads to several important effects. One of these is the surprising ability of a single prolific author to noticeably insert new phrases into a language. A greater effect arises from scientific texts, which have become increasingly prolific in the last several decades and are heavily sampled in the corpus. The result is a surge of phrases typical to academic articles but less common in general, such as references to time in the form of citations. Here, we highlight these dynamics by examining and comparing major contributions to the statistical divergence of English data sets between decades in the period 1800–2000. We find that only the English Fiction data set from the second version of the corpus is not heavily affected by professional texts, in clear contrast to the first version of the fiction data set and both unfiltered English data sets. Our findings emphasize the need to fully characterize the dynamics of the Google Books corpus before using these data sets to draw broad conclusions about cultural and linguistic evolution.


Characterizing the Google Books corpus: Strong limits to inferences of socio-cultural and linguistic evolution
Eitan Adam Pechenick, Christopher M. Danforth, Peter Sheridan Dodds

http://arxiv.org/abs/1501.00960
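
A minimal sketch of the kind of decade-to-decade comparison described above (toy counts, and not necessarily the authors' exact divergence measure): compare the word-frequency distributions of two decades with the Jensen-Shannon divergence and list the words contributing most to it.

```python
# Per-word contributions to the Jensen-Shannon divergence between two decades.
import math
from collections import Counter

decade_a = Counter({"the": 900, "figure": 5, "love": 40, "percent": 3, "war": 25})
decade_b = Counter({"the": 880, "figure": 60, "love": 22, "percent": 45, "war": 12})

def normalize(counts):
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def js_contributions(p, q):
    """Per-word contributions to the Jensen-Shannon divergence (base 2)."""
    contrib = {}
    for w in set(p) | set(q):
        pi, qi = p.get(w, 0.0), q.get(w, 0.0)
        mi = 0.5 * (pi + qi)
        term = 0.0
        if pi > 0:
            term += 0.5 * pi * math.log2(pi / mi)
        if qi > 0:
            term += 0.5 * qi * math.log2(qi / mi)
        contrib[w] = term
    return contrib

contrib = js_contributions(normalize(decade_a), normalize(decade_b))
print("JSD =", round(sum(contrib.values()), 4))
for w, c in sorted(contrib.items(), key=lambda kv: -kv[1])[:3]:
    print(f"{w:>8s}  {c:.4f}")
```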

Scooped by Complexity Digest

Defensive complexity in antagonistic coevolution

One strategy for winning a coevolutionary struggle is to evolve rapidly. Most of the literature on host-pathogen coevolution focuses on this phenomenon, and looks for consequent evidence of coevolutionary arms races. An alternative strategy, less often considered in the literature, is to deter rapid evolutionary change by the opponent. To study how this can be done, we construct an evolutionary game between a controller that must process information, and an adversary that can tamper with this information processing. In this game, a species can foil its antagonist by processing information in a way that is hard for the antagonist to manipulate. We show that the structure of the information processing system induces a fitness landscape on which the adversary population evolves, and that complex processing logic is required to make that landscape rugged. Drawing on the rich literature concerning rates of evolution on rugged landscapes, we show how a species can slow adaptive evolution in the adversary population. We suggest that this type of defensive complexity on the part of the vertebrate adaptive immune system may be an important element of coevolutionary dynamics between pathogens and their vertebrate hosts.


Defensive complexity in antagonistic coevolution
Erick Chastain, Rustom Antia, Carl T. Bergstrom

http://arxiv.org/abs/1203.4601
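
A toy illustration of the ruggedness argument (an NK-style landscape model, not the controller/adversary game constructed in the paper): greedy adaptive walks typically stall after fewer improving steps as the epistatic parameter K grows, which is the sense in which a rugged landscape slows the adversary's adaptation. All parameters are arbitrary.

```python
# Adaptive walks on NK-style landscapes of increasing ruggedness.
import itertools
import numpy as np

rng = np.random.default_rng(3)

def nk_fitness(N, K):
    """Random NK fitness function: each locus interacts with its K successors."""
    tables = [
        {bits: rng.random() for bits in itertools.product((0, 1), repeat=K + 1)}
        for _ in range(N)
    ]
    def fitness(genome):
        return np.mean([
            tables[i][tuple(genome[(i + j) % N] for j in range(K + 1))]
            for i in range(N)
        ])
    return fitness

def adaptive_walk(fitness, N):
    """Greedy one-bit hill climbing; returns the number of improving steps."""
    genome = [int(b) for b in rng.integers(0, 2, size=N)]
    steps, improved = 0, True
    while improved:
        improved = False
        current = fitness(genome)
        for i in rng.permutation(N):
            genome[i] ^= 1                    # trial mutation
            if fitness(genome) > current:
                steps, improved = steps + 1, True
                break
            genome[i] ^= 1                    # revert if not an improvement
    return steps

N = 12
for K in (0, 2, 8):
    f = nk_fitness(N, K)
    lengths = [adaptive_walk(f, N) for _ in range(50)]
    print(f"K={K}: mean improving steps before a local optimum = {np.mean(lengths):.1f}")
```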
