The global spread of epidemics, rumors, opinions, and innovations is a complex, network-driven dynamic process. The combined multiscale nature and intrinsic heterogeneity of the underlying networks make it difficult to develop an intuitive understanding of these processes, to distinguish relevant from peripheral factors, to predict their time course, and to locate their origin. However, we show that complex spatiotemporal patterns can be reduced to surprisingly simple, homogeneous wave propagation patterns, if conventional geographic distance is replaced by a probabilistically motivated effective distance. In the context of global, air-traffic–mediated epidemics, we show that effective distance reliably predicts disease arrival times. Even if epidemiological parameters are unknown, the method can still deliver relative arrival times. The approach can also identify the spatial origin of spreading processes and can be successfully applied to data from the worldwide 2009 H1N1 influenza pandemic and the 2003 SARS epidemic.
The Hidden Geometry of Complex, Network-Driven Contagion Phenomena Dirk Brockmann, Dirk Helbing
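As a rough illustration of the effective-distance construction described above, assuming the common convention d(n, m) = 1 - ln P(n, m), where P(n, m) is the fraction of travellers leaving node n that move to node m, the following sketch computes effective distances from an outbreak origin on a toy flux matrix (all numbers are hypothetical):

```python
# Illustrative sketch (not the authors' code): effective distance on a toy
# mobility network. Link length is 1 - ln(P[n, m]); effective distance from the
# origin is the shortest-path length under these link lengths.
import numpy as np
import networkx as nx

# Hypothetical flux matrix: F[n, m] = travellers per day from node n to node m.
F = np.array([[0, 500, 100, 0],
              [400, 0, 300, 50],
              [80, 250, 0, 600],
              [0, 40, 700, 0]], dtype=float)

outflow = F.sum(axis=1, keepdims=True)
P = F / outflow                          # P[n, m]: probability a traveller at n moves to m

G = nx.DiGraph()
for n in range(F.shape[0]):
    for m in range(F.shape[1]):
        if F[n, m] > 0:
            G.add_edge(n, m, weight=1.0 - np.log(P[n, m]))   # effective link length >= 1

origin = 0
d_eff = nx.single_source_dijkstra_path_length(G, origin, weight="weight")
print(d_eff)   # effective distance from the outbreak origin to every other node
```

Arrival times at the other nodes then grow roughly linearly with this quantity rather than with geographic distance, which is the sense in which the spreading pattern becomes a simple wave.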
When people first hear about quantum computers, a common response is “where and when can I get one?” But that’s the wrong question, and not just because you’ll be disappointed with the answer. Quantum computers are often said to promise faster, bigger, more multi-layered computation—but they are not, and might never be, an upgrade of your laptop. They’re just not that sort of machine. So what are they, and why do we want them?
Detecting overlapping communities is essential to analyzing and exploring natural networks such as social networks, biological networks, and citation networks. However, most existing approaches do not scale to the size of networks that we regularly observe in the real world. In this paper, we develop a scalable approach to community detection that discovers overlapping communities in massive real-world networks. Our approach is based on a Bayesian model of networks that allows nodes to participate in multiple communities, and a corresponding algorithm that naturally interleaves subsampling from the network and updating an estimate of its communities. We demonstrate how we can discover the hidden community structure of several real-world networks, including 3.7 million US patents, 575,000 physics articles from the arXiv preprint server, and 875,000 connected Web pages from the Internet. Furthermore, we demonstrate on large simulated networks that our algorithm accurately discovers the true community structure. This paper opens the door to using sophisticated statistical models to analyze massive networks.
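The abstract's key algorithmic idea, alternating between subsampling the network and updating per-node community memberships, can be caricatured with a much simpler non-Bayesian surrogate. The sketch below uses a BIGCLAM-style nonnegative factorization objective rather than the authors' model; the function name, parameters, and learning rate are hypothetical:

```python
# Minimal illustrative sketch: interleave edge subsampling with updates to
# per-node community memberships. This is a simple stochastic-gradient
# nonnegative factorization, not the authors' Bayesian model or algorithm.
import numpy as np

def fit_overlapping_communities(edges, n_nodes, n_comms=5, n_steps=20000, lr=0.01, seed=0):
    rng = np.random.default_rng(seed)
    edges = list(edges)
    edge_set = set(edges)
    F = rng.random((n_nodes, n_comms)) * 0.1        # nonnegative membership matrix
    for _ in range(n_steps):
        # subsample one observed edge and push its endpoints together ...
        u, v = edges[rng.integers(len(edges))]
        x = np.exp(-F[u] @ F[v])
        grad = x / max(1.0 - x, 1e-10)
        F[u] += lr * grad * F[v]
        F[v] += lr * grad * F[u]
        # ... and one random node pair treated as a non-edge, pushed apart
        a, b = rng.integers(n_nodes, size=2)
        if a != b and (a, b) not in edge_set and (b, a) not in edge_set:
            F[a] -= lr * F[b]
            F[b] -= lr * F[a]
        np.clip(F, 1e-8, None, out=F)               # keep memberships nonnegative
    return F    # row u: affinity of node u to each community; several large entries = overlap
```

A node whose membership row has several large entries participates in several communities, which is the overlapping structure the paper is after.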
In evolutionary game dynamics in finite populations, selection intensity plays a key role in determining the impact of the game on reproductive success. Weak selection is often employed to obtain analytical results in evolutionary game theory. We investigate the validity of weak selection predictions for stronger intensities of selection. We prove that, in general, qualitative results obtained under weak selection fail to extend even to moderate selection strengths for games with either more than two strategies or more than two players. In particular, we find that even in pairwise interactions, qualitative changes with increasing selection intensity become almost certain when the number of strategies is large.
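To make the notion of selection intensity concrete, here is a standard textbook computation (not the authors' multi-strategy analysis) of the fixation probability of a single mutant in a frequency-dependent Moran process with an exponential payoff-to-fitness mapping, evaluated from weak to strong selection; the payoff matrix is hypothetical:

```python
# Illustrative sketch: fixation probability of a single A-mutant in a Moran
# process for a 2x2 game at different selection intensities beta.
import numpy as np

def fixation_probability(a, b, c, d, N=100, beta=0.01):
    """rho_A = 1 / (1 + sum_k prod_{i<=k} f_B(i)/f_A(i)), with f = exp(beta * payoff)."""
    ratios = []
    for i in range(1, N):                            # i = current number of A players
        pi_A = (a * (i - 1) + b * (N - i)) / (N - 1)
        pi_B = (c * i + d * (N - i - 1)) / (N - 1)
        ratios.append(np.exp(-beta * (pi_A - pi_B)))  # f_B(i) / f_A(i)
    return 1.0 / (1.0 + np.cumprod(ratios).sum())

for beta in [0.001, 0.01, 0.1, 1.0]:                 # from weak to strong selection
    print(beta, fixation_probability(a=3, b=1, c=4, d=2, beta=beta))
```

Comparing the output across beta values shows how conclusions drawn in the weak-selection limit can shift as selection becomes stronger.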
The financial crisis clearly illustrated the importance of characterizing the level of ‘systemic’ risk associated with an entire credit network, rather than with single institutions. However, the interplay between financial distress and topological changes is still poorly understood. Here we analyze the quarterly interbank exposures among Dutch banks over the period 1998–2008, ending with the crisis. After controlling for the link density, many topological properties display an abrupt change in 2008, providing a clear – but unpredictable – signature of the crisis. By contrast, if the heterogeneity of banks' connectivity is controlled for, the same properties show a gradual transition to the crisis, starting in 2005 and preceded by an even earlier period during which anomalous debt loops could have led to the underestimation of counter-party risk. These early-warning signals are undetectable if the network is reconstructed from partial bank-specific data, as routinely done. We discuss important implications for bank regulatory policies.
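The "debt loops" mentioned above are directed cycles of exposures (A owes B, B owes C, C owes A). As a minimal illustration, the sketch below counts directed 3-cycles in a toy binary exposure matrix; the comparison against density- or degree-controlled null models that the authors perform is not reproduced here:

```python
# Illustrative sketch: counting directed 3-cycles ("debt loops") in a
# hypothetical binary exposure network.
import numpy as np

rng = np.random.default_rng(1)
A = (rng.random((50, 50)) < 0.05).astype(int)   # A[i, j] = 1 if bank i is exposed to bank j
np.fill_diagonal(A, 0)                          # no self-exposures

n_debt_loops = np.trace(A @ A @ A) // 3         # each directed 3-cycle is counted 3 times
print(n_debt_loops)
```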
Hod Lipson, an engineer who runs the Creative Machines Lab at Cornell University in Ithaca, New York, draws an analogy with the history of the computer. In the 1950s, computers were rare, expensive and owned mostly by large universities and businesses, and they required expert users to perform even relatively simple tasks. By the 1970s and 1980s, personal computers had emerged, and enthusiasts were assembling them from kits and writing their own software. Now practically everyone carries a powerful computer in their pocket and can do all manner of tasks with no programming knowledge. In the case of 3D printing, Lipson says, the transition from rare, limited and cumbersome to common, versatile and easy-to-use is happening quickly. “I used to say we're in the 1975 of printers, and now we're in the mid-80s already,” he says. “We're still at the point where most people are not comfortable using 3D printers and design tools. Those who are can make things a lot easier for themselves and get an edge.”
We propose a model that explains the reliable emergence of power laws (e.g., Zipf’s law) during the development of different human languages. The model incorporates the principle of least effort in communications, minimizing a combination of the information-theoretic communication inefficiency and direct signal cost. We prove a general relationship, for all optimal languages, between the signal cost distribution and the resulting distribution of signals. Zipf’s law then emerges for logarithmic signal cost distributions, which is the cost distribution expected for words constructed from letters or phonemes.
Zipf’s Law: Balancing Signal Usage Cost and Communication Efficiency Christoph Salge, Nihat Ay, Daniel Polani, Mikhail Prokopenko
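A standard maximum-entropy caricature of this mechanism (our illustration, not the paper's proof, with alpha and beta as free parameters): trading off expected signal cost against communication entropy yields a usage distribution that is exponential in the cost, and a logarithmic cost, roughly the length of a word built from letters or phonemes, turns that exponential into a power law in rank:

```latex
\[
  \mathcal{L}[p] \;=\; \sum_s p(s)\,c(s) \;-\; \tfrac{1}{\beta}\,H[p]
  \quad\Longrightarrow\quad
  p(s) \;\propto\; e^{-\beta\,c(s)},
\]
\[
  c(s_r) \;=\; \alpha \ln r
  \quad\Longrightarrow\quad
  p(s_r) \;\propto\; r^{-\alpha\beta}
  \qquad \text{(a Zipf-like power law in the rank } r\text{)}.
\]
```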
We investigate the failure mechanisms of load-sharing complex systems. The system is composed of multiple nodes or components whose failures are determined by the interaction of their respective strengths and loads (or capacities and demands, respectively), as well as by the ability of a component to share its load with its neighbors when needed. We focus on two distinct mechanisms to model the interaction between components' strengths and loads. The failure mechanisms of these two models exhibit temporal scaling phenomena, phase transitions, and multiple distinct failure modes excited by extremal dynamics. For critical ranges of parameters the models demonstrate power-law and exponential failure patterns. We identify the similarities and differences between the two mechanisms and the implications of our results for the failure mechanisms of complex systems in the real world.
Failure mechanisms of load sharing complex systems Shahnewaz Siddique, Vitali Volovoi
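A generic equal-load-sharing (fiber-bundle-style) cascade gives a feel for the kind of dynamics studied, though it is not either of the authors' two specific mechanisms; the strength distribution and total demand below are hypothetical:

```python
# Illustrative sketch: N components with random strengths share a total load
# equally; every failure redistributes load onto the survivors until no further
# component fails (or the whole system collapses).
import numpy as np

rng = np.random.default_rng(2)
N = 10_000
strengths = rng.weibull(2.0, size=N)        # hypothetical strength distribution
total_load = 0.3 * N                        # hypothetical total demand

alive = np.ones(N, dtype=bool)
while alive.any():
    load_per_component = total_load / alive.sum()
    newly_failed = alive & (strengths < load_per_component)
    if not newly_failed.any():
        break
    alive &= ~newly_failed                  # failed components shed their load
print("surviving fraction:", alive.mean())
```

Sweeping the total demand toward its critical value is what produces the power-law and exponential failure patterns referred to in the abstract.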
Since the year 2000, psychological research has tied gratitude to a host of benefits: the tendency to feel more hopeful and optimistic about one’s own future, better coping mechanisms for dealing with adversity and stress, fewer instances of depression and addiction, more exercise, and even better sleep. The degree to which we’re grateful “can explain more variance in life satisfaction than such traits as love, forgiveness, social intelligence, and humor,” sings one recent paper. “Gratitude is strongly related to all aspects of well-being,” declares another.
We study a simple voter model with two competing parties. In particular, we represent the case of political elections, where people can choose to support one of the two competitors or to remain neutral. People interact in a social network, and their opinions depend on those of their neighbors. Therefore, people may change opinion over time, i.e., they can support one competitor or none. The two competitors try to gain the people's consensus by interacting with their neighbors and also with other people. In particular, competitors establish temporal connections, following a strategy, to interact with people they do not know, i.e., with all the people who are not their neighbors. We analyze the proposed model to investigate which network strategies are more advantageous for the competitors in gaining popular consensus. As a result, we find that the best network strategy depends on the topology of the social network. Finally, we investigate how the charisma of the competitors affects the outcomes of the proposed model.
Network Strategies in the Voter Model Marco Alberto Javarone
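For orientation, here is a bare-bones three-state voter dynamic (support A, support B, or stay neutral) in which each node simply copies a random neighbour; the competitors' strategic temporal links described in the paper are not modelled in this sketch:

```python
# Illustrative sketch: a plain voter dynamic with a neutral state on a random graph.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
G = nx.erdos_renyi_graph(500, 0.02, seed=3)
state = rng.choice(["A", "B", "neutral"], size=G.number_of_nodes())

for _ in range(50_000):
    node = rng.integers(G.number_of_nodes())
    neighbours = list(G.neighbors(node))
    if neighbours:
        state[node] = state[rng.choice(neighbours)]   # copy a random neighbour's opinion

values, counts = np.unique(state, return_counts=True)
print(dict(zip(values, counts)))
```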
We argue for a decentralized approach where the participants in the social network keep their own data and perform computations in a distributed fashion without any central authority. A natural question that arises then is what distributed computations can be performed in such a decentralized setting. Our primary contribution is to lay the ground for expressing the question precisely. We refer to the underlying problem as the S3 problem: Scalable Secure computing in a Social network. Whereas scalability characterizes the spatial, computational and message complexity of the computation, the secure aspect of S3 encompasses accuracy and privacy.(...)
Computing in Social Networks
Andrei Giurgiu, Rachid Guerraoui, Kévin Huguenin, Anne-Marie Kermarrec
The discovery of the Archaea and the proposal of the three-domains ‘universal’ tree, based on ribosomal RNA and core genes mainly involved in protein translation, catalysed new ideas for cellular evolution and eukaryotic origins. However, accumulating evidence suggests that the three-domains tree may be incorrect: evolutionary trees made using newer methods place eukaryotic core genes within the Archaea, supporting hypotheses in which an archaeon participated in eukaryotic origins by founding the host lineage for the mitochondrial endosymbiont. These results provide support for only two primary domains of life—Archaea and Bacteria—because eukaryotes arose through partnership between them.
An archaeal origin of eukaryotes supports only two primary domains of life Tom A. Williams, Peter G. Foster, Cymon J. Cox & T. Martin Embley
The relation between flow and density, also known as the fundamental diagram, is an essential quantitative characteristic to describe the efficiency of traffic systems. We have performed experiments on the single-file motion of bicycles and compare the results with previous studies of car and pedestrian motion in similar setups. In the space-time diagrams we observe three different states of motion (free flow, jammed states and stop-and-go waves) in all these systems. Despite their obvious differences, they are described by a universal fundamental diagram after a proper rescaling of space and time that takes into account the size and free velocity of the three kinds of agents. This indicates that the similarities between the systems go deeper than expected.
Universal flow-density relation of single-file bicycle, pedestrian and car motion Jun Zhang, Wolfgang Mehner, Stefan Holl, Maik Boltes, Erik Andresen, Andreas Schadschneider, Armin Seyfried
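The rescaling idea can be illustrated in a few lines: density is nondimensionalised by agent size and flow by size over free speed, so data for the three kinds of agents can be compared in one diagram. The sizes and free speeds below are rough placeholder values, not the quantities measured in the experiments:

```python
# Illustrative sketch: nondimensionalising single-file flow-density data.
agents = {
    #             size [m]   free speed [m/s]  (hypothetical figures)
    "pedestrian": (0.4,       1.3),
    "bicycle":    (1.8,       4.0),
    "car":        (4.5,      15.0),
}

def rescale(density, flow, agent):
    """Return dimensionless (density, flow) for single-file motion."""
    size, v0 = agents[agent]
    return density * size, flow * size / v0

print(rescale(density=0.3, flow=0.25, agent="bicycle"))   # e.g. 0.3 bikes/m, 0.25 bikes/s
```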
The extensive exploitation of marine resources by modern fisheries (...) has wide-ranging effects on marine ecosystems. Across the world's oceans, size-selective harvesting by commercial fisheries has been a key driving force behind changes in phenotypic traits such as body size and age at maturation (1–3). These changes have altered the trophic structure of the affected ecosystems, disturbed predator–prey relationships, and modified trophic cascade dynamics (3, 4). Phenotypic changes can involve both ecological and evolutionary reactions to the effect of fishing, and there has been much debate about the relative roles of these reactions. This is important because genetic changes could result in long-term reductions in catches. Recent work has provided evidence for fisheries-induced evolutionary changes, with important implications for the sustainability of fisheries.
How Fisheries Affect Evolution Andrea Belgrano, Charles W. Fowler
Does the availability of instant reference checking and “find more like this” research on the Internet change the standards by which academics should feel “obligated” to cite the work of others? Is the deliberate refusal to look for the existence of parallel work by others an ethical lapse or merely negligence? At a minimum, the Dutch standard of Slodderwetenschap (sloppy science) is clearly at work. At a maximum, so is plagiarism. In between sits what might be labeled ‘plagiarism by negligence’. This article seeks to expose the intellectual folly of allowing such plagiarism to be tolerated by the academy, through a discussion of the cases of Terrence Deacon and Stephen Wolfram.
Subliminal Influence or Plagiarism by Negligence? The Slodderwetenschap of Ignoring the Internet
The new science of complex systems will be at the heart of the future of the Worldwide Knowledge Society. It is providing radical new ways of understanding the physical, biological, ecological, and techno-social universe. Complex Systems are open, value-laden, multi-level, multi-component, reconfigurable systems of systems, situated in turbulent, unstable, and changing environments. They evolve, adapt and transform through internal and external dynamic interactions. They are the source of very difficult scientific challenges for observing, understanding, reconstructing and predicting their multi-scale dynamics. The challenges posed by the multi-scale modelling of both natural and artificial adaptive complex systems can only be met with radically new collective strategies for research and teaching (...)
The hallmark of deterministic chaos is that it creates information—the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system’s intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: A portion of the created information—the ephemeral information—is forgotten and a portion—the bound information—is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute.
Chaos Forgets and Remembers: Measuring Information Creation, Destruction, and Storage Ryan G. James, Korana Burke, James P. Crutchfield
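In the information-anatomy notation used in this line of work, the decomposition can be summarised as follows (our paraphrase): the entropy rate h_mu splits into an ephemeral part r_mu, information about the present that neither past nor future retains, and a bound part b_mu, information created now that the future remembers:

```latex
\[
  h_\mu \;=\; r_\mu + b_\mu,
  \qquad
  r_\mu \;=\; H\!\left[X_0 \,\middle|\, X_{-\infty:0},\, X_{1:\infty}\right],
  \qquad
  b_\mu \;=\; I\!\left[X_0 : X_{1:\infty} \,\middle|\, X_{-\infty:0}\right].
\]
```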
Networks in the real world do not exist as isolated entities; they are often part of more complicated structures composed of many interconnected network layers. Recent studies have shown that such mutual dependence leaves real networked systems exposed to potentially catastrophic failures, and thus there is an urgent need to better understand the mechanisms at the basis of this fragility. The theoretical approach to this problem is based on the study of the nature of the phase transitions associated with critical phenomena running on interconnected networks. In particular, it has been shown that many critical phenomena that are continuous in isolated networks become instead discontinuous, and thus catastrophic, in multi-layer networks when the strength of the interconnections is sufficiently large. In this paper, we show that four main ingredients determine the critical features of a random interconnected network: the strength of the interconnections, the first two moments of the degree distribution of the entire network, and the correlation between intra- and inter-layer degrees. Different mixtures of these ingredients change the location of the critical points and lead to the emergence of a very rich scenario in which phase transitions can be either discontinuous or continuous and different regimes can disappear or even coexist.
Coexistence of critical regimes in interconnected networks Filippo Radicchi
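As a very rough numerical illustration of how interconnection strength moves the connectivity transition, the sketch below joins two random layers with a tunable density of inter-layer links and tracks the relative size of the largest component; all parameters are hypothetical, and the paper's analysis of whether the transition is continuous or discontinuous is not reproduced:

```python
# Illustrative sketch: two subcritical random layers become globally connected
# as the density of inter-layer links grows.
import random
import networkx as nx

def giant_fraction(n=2000, avg_degree=0.8, p_inter=0.0002, seed=0):
    random.seed(seed)
    g1 = nx.fast_gnp_random_graph(n, avg_degree / n, seed=seed)
    g2 = nx.fast_gnp_random_graph(n, avg_degree / n, seed=seed + 1)
    g = nx.disjoint_union(g1, g2)              # nodes 0..n-1 = layer 1, n..2n-1 = layer 2
    for _ in range(int(p_inter * n * n)):      # sprinkle random inter-layer links
        g.add_edge(random.randrange(n), n + random.randrange(n))
    giant = max(nx.connected_components(g), key=len)
    return len(giant) / (2 * n)

for p_inter in [0.0, 0.00005, 0.0001, 0.0002, 0.0005]:
    print(p_inter, giant_fraction(p_inter=p_inter))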
That heat always flows from a hot place to a cold one is a manifestation of the second law of thermodynamics. It is such a part of our everyday experience that to propose the opposite would seem ridiculous. But, that was, in principle, what the Scottish physicist James Clerk Maxwell—one of the giants of physics—suggested in 1867 with his idea of Maxwell’s demon. Maxwell’s thesis was that the second law of thermodynamics has only a statistical certainty. In other words, with sufficiently detailed information about the microscopic motions of individual atoms and molecules, we might be able to separate the fast-moving (“hot”) ones from the slow-moving (“cold”) ones and induce heat to flow from cold to hot, in apparent contradiction with the second law of thermodynamics. This apparent conundrum, which has caught the attention of many well-known physicists from Lord Kelvin onward, is typically interpreted as the result of an external observer tinkering with the actual microscopic dynamics of a thermodynamic system—the “demon,” still alien to many. In this paper, we take a different approach. To the traditional thermodynamic analyses that employ heat and work reservoirs, we add an information reservoir in place of an external observer. The information reservoir captures the physical consequences of Maxwell’s demon, or more precisely, its memory. We then use a single time-independent Hamiltonian to describe a collection of thermodynamic elements, including a device, one or more heat baths, a work reservoir, and an information reservoir (like, for instance, a stream of bits). As a main result, we generalize several classic formulations of the second law of thermodynamics, such as the Kelvin-Planck, Clausius, and Carnot statements, to situations involving information processing. Our paper provides a general conceptual framework for further development and explorations in the cross-disciplinary field of thermodynamic theories of information processing.
Information Processing and the Second Law of Thermodynamics: An Inclusive, Hamiltonian Approach Sebastian Deffner and Christopher Jarzynski
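A schematic information-augmented Clausius-type bound of the kind generalised in such work (our paraphrase, not the paper's precise statement): the work extracted from a single heat bath at temperature T while writing to a stream of bits is limited by the entropy the stream takes up,

```latex
\[
  \langle W_{\mathrm{ext}} \rangle \;\le\; k_B T \ln 2 \;\Delta H_{\mathrm{res}},
\]
```

where \(\Delta H_{\mathrm{res}}\) is the increase (in bits) of the Shannon entropy of the information reservoir.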
The massive amounts of geolocation data collected from mobile phone records have sparked an ongoing effort to understand and predict the mobility patterns of human beings. In this work, we study the extent to which social phenomena are reflected in mobile phone data, focusing in particular on the cases of urban commuting and major sports events. We illustrate how these events are reflected in the data, and show how information about the events can be used to improve predictability in a simple model of a mobile phone user's location.
Human Mobility and Predictability enriched by Social Phenomena Information Nicolas Ponieman, Alejo Salles, Carlos Sarraute
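A minimal baseline of the sort such studies enrich with event information is a first-order Markov predictor that guesses the most frequent successor of the current location; the toy visit sequence below is hypothetical:

```python
# Illustrative sketch: predict a user's next location from observed transitions.
from collections import Counter, defaultdict

visits = ["home", "work", "home", "gym", "home", "work", "home", "work", "stadium"]

transitions = defaultdict(Counter)
for here, there in zip(visits, visits[1:]):
    transitions[here][there] += 1

def predict_next(current):
    """Most frequent observed successor of the current location."""
    if not transitions[current]:
        return None
    return transitions[current].most_common(1)[0][0]

print(predict_next("home"))   # -> 'work'
```

Event information (a commute hour, a match at the stadium) can then be used to override or reweight these baseline transition counts.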
We compared the entropy of texts written in natural languages (English, Spanish) and artificial languages (computer software), based on a simple expression for the entropy as a function of message length and specific word diversity. Code written in artificial languages showed higher entropy than text of similar length expressed in natural languages. Spanish texts exhibit more symbolic diversity than English ones. The results show that algorithms based on complexity measures differentiate artificial from natural languages, and that text analysis based on complexity measures reveals important aspects of their nature. We propose specific expressions to examine entropy-related aspects of texts and to estimate the values of entropy, emergence, self-organization, and complexity based on specific diversity and message length.
Complexity measurement of natural and artificial languages Gerardo Febres, Klaus Jaffe, Carlos Gershenson
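For comparison, here is a plain word-level Shannon entropy estimate (bits per word), simpler than the authors' length- and diversity-dependent expression but illustrating the kind of quantity contrasted across natural and artificial text:

```python
# Illustrative sketch: empirical word-level Shannon entropy of a text.
import math
from collections import Counter

def word_entropy(text):
    """Shannon entropy (bits per word) of the empirical word-frequency distribution."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

natural = "the cat sat on the mat and the dog sat on the rug"
code    = "for i in range(10): print(i) ; for j in range(10): print(j)"
print(word_entropy(natural), word_entropy(code))
```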
Communication depends on reliability. Yet, the existence of stable honest signalling presents an evolutionary puzzle. Why should animals signal honestly in the face of a conflict of interest? This study [...] provides, to our knowledge, the first experimental evidence showing honesty persists when costs are high and disappears when costs are low.