Is it possible to predict how individuals will perform before the teamwork begins? Research by former cyclist Hugh Trenchard and others suggests that the mathematics of pelotons – the groups and bunches that cyclists form during a race – could be key to understanding how cyclists behave as a collective entity. While these collective dynamics may not tell us who will win the Tour de France, they do have broader applications to a variety of other biological systems. Here, Trenchard tells us more about his research, and how it might even provide some clues to the origin of life.
The morphology of urban agglomeration is studied here in the context of information exchange between different spatio-temporal scales. Urban migration to and from cities is characterised as non-random and following non-random pathways. Cities are multidimensional non-linear phenomena, so understanding the relationships and connectivity between scales is important in determining how the interplay of local/regional urban policies may affect the distribution of urban settlements. In order to quantify these relationships, we follow an information theoretic approach using the concept of Transfer Entropy. Our analysis is based on a stochastic urban fractal model, which mimics growing urban settlements and migration waves. The results indicate how different policies could affect urban morphology in terms of the information generated across geographical scales.
In studying fundamental physical limits and properties of computational processes, one is faced with the challenges of interpreting primitive information-processing functions through well-defined information-theoretic as well as thermodynamic quantities. In particular, transfer entropy, characterizing the function of computational transmission and its predictability, is known to peak near critical regimes. We focus on a thermodynamic interpretation of transfer entropy aiming to explain the underlying critical behavior by associating information flows intrinsic to computational transmission with particular physical fluxes. Specifically, in isothermal systems near thermodynamic equilibrium, the gradient of the average transfer entropy is shown to be dynamically related to Fisher information and the curvature of the system's entropy. This relationship explicitly connects the predictability, sensitivity, and uncertainty of computational processes intrinsic to complex systems and allows us to consider thermodynamic interpretations of several important extreme cases and trade-offs.
Information thermodynamics of near-equilibrium computation Mikhail Prokopenko and Itai Einav Phys. Rev. E 91, 062143
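For readers unfamiliar with the central quantity in the two entries above, transfer entropy from a source series to a target series can be estimated directly from counts. The sketch below is a minimal, illustrative one-step plug-in estimator for discrete series (an assumption-laden toy, not the estimator or formalism used in either paper):

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(src, dst):
    """Plug-in estimate of T_{src -> dst} =
    sum p(d1, d0, s0) * log2[ p(d1 | d0, s0) / p(d1 | d0) ]
    with one step of history, for two equal-length discrete series."""
    d1d0s0 = Counter(zip(dst[1:], dst[:-1], src[:-1]))  # (d_{t+1}, d_t, s_t)
    d0s0 = Counter(zip(dst[:-1], src[:-1]))             # (d_t, s_t)
    d1d0 = Counter(zip(dst[1:], dst[:-1]))              # (d_{t+1}, d_t)
    d0 = Counter(dst[:-1])                              # d_t
    n = len(dst) - 1
    te = 0.0
    for (d1, x0, s0), c in d1d0s0.items():
        p_joint = c / n
        p_full = c / d0s0[(x0, s0)]        # p(d1 | d0, s0)
        p_self = d1d0[(d1, x0)] / d0[x0]   # p(d1 | d0)
        te += p_joint * log2(p_full / p_self)
    return te

# Source drives target with a one-step lag: dst simply copies src.
random.seed(0)
src = [random.randint(0, 1) for _ in range(5000)]
dst = [0] + src[:-1]
print(transfer_entropy(src, dst))  # close to 1 bit
print(transfer_entropy(dst, src))  # close to 0 bits
```

The asymmetry of the two printed values is the point: information flows from the driver to the driven series, not back, which is what makes transfer entropy useful for characterizing computational transmission.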
Classic life history models are often based on optimization algorithms, focusing on the adaptation of survival and reproduction to the environment, while neglecting frequency dependent interactions in the population. Evolutionary game theory, on the other hand, studies frequency dependent strategy interactions, but usually omits life history and the demographic structure of the population. Here we show how an integration of both aspects can substantially alter the underlying evolutionary dynamics.
Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth’s surface; however, in modern contagions long-range edges—for example, due to airline transportation or communication media—allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct ‘contagion maps’ that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.
In his 1942 short story 'Runaround', science-fiction writer Isaac Asimov introduced the Three Laws of Robotics — engineering safeguards and built-in ethical principles that he would go on to use in dozens of stories and novels. They were: 1) A robot may not injure a human being or, through inaction, allow a human being to come to harm; 2) A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law; and 3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws. Fittingly, 'Runaround' is set in 2015. Real-life roboticists are citing Asimov's laws a lot these days: their creations are becoming autonomous enough to need that kind of guidance. In May, a panel talk on driverless cars at the Brookings Institution, a think tank in Washington DC, turned into a discussion about how autonomous vehicles would behave in a crisis. What if a vehicle's efforts to save its own passengers by, say, slamming on the brakes risked a pile-up with the vehicles behind it? Or what if an autonomous car swerved to avoid a child, but risked hitting someone else nearby?
It is common practice to partition complex workflows into separate channels in order to speed up their completion times. When this is done within a distributed environment, unavoidable fluctuations make individual realizations depart from the expected average gains. We present a method for breaking any complex workflow into several workloads in such a way that once their outputs are joined, their full completion takes less time and exhibits smaller variance than when running in only one channel. We demonstrate the effectiveness of this method in two different scenarios: the optimization of a convex function and the transmission of a large computer file over the Internet.
Partitioning Uncertain Workflows Bernardo A. Huberman, Freddy C. Chua
Cascades in multiplex financial networks with debts of different seniority
The seniority of debt, which determines the order in which a bankrupt institution repays its debts, is an important and sometimes contentious feature of financial crises, yet its impact on systemwide stability is not well understood. We capture seniority of debt in a multiplex network, a graph of nodes connected by multiple types of edges. Here an edge between banks denotes a debt contract of a certain level of seniority. Next we study cascading default. There exist multiple kinds of bankruptcy, indexed by the highest level of seniority at which a bank cannot repay all its debts. Self-interested banks would prefer that all their loans be made at the most senior level. However, mixing debts of different seniority levels makes the system more stable in that it shrinks the set of network densities for which bankruptcies spread widely. We compute the optimal ratio of senior to junior debts, which we call the optimal seniority ratio, for two uncorrelated Erdős-Rényi networks. If institutions erode their buffer against insolvency, then this optimal seniority ratio rises; in other words, if default thresholds fall, then more loans should be senior. We generalize the analytical results to arbitrarily many levels of seniority and to heavy-tailed degree distributions.
We introduce a new kind of percolation on finite graphs called jigsaw percolation. This model attempts to capture networks of people who innovate by merging ideas and who solve problems by piecing together solutions. Each person in a social network has a unique piece of a jigsaw puzzle. Acquainted people with compatible puzzle pieces merge their puzzle pieces. More generally, groups of people with merged puzzle pieces merge if the groups know one another and have a pair of compatible puzzle pieces. The social network solves the puzzle if it eventually merges all the puzzle pieces. For an Erdős–Rényi social network with n vertices and edge probability p_n, we define the critical value p_c(n) for a connected puzzle graph to be the p_n for which the chance of solving the puzzle equals 1/2. We prove that for the n-cycle (ring) puzzle, p_c(n)=Θ(1/log n), and for an arbitrary connected puzzle graph with bounded maximum degree, p_c(n)=O(1/log n) and ω(1/n^b) for any b>0. Surprisingly, with probability tending to 1 as the network size increases to infinity, social networks with a power-law degree distribution cannot solve any bounded-degree puzzle. This model suggests a mechanism for recent empirical claims that innovation increases with social density, and it might begin to show which social networks stifle creativity and which networks collectively innovate.
Brummitt, Charles D.; Chatterjee, Shirshendu; Dey, Partha S.; Sivakoff, David. Jigsaw percolation: What social networks can collaboratively solve a puzzle? Ann. Appl. Probab. 25 (2015), no. 4, 2013–2038. doi:10.1214/14-AAP1041. http://projecteuclid.org/euclid.aoap/1432212435
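The merging rule described in the abstract is simple enough to simulate directly. The following is a minimal, illustrative sketch (not the authors' code), assuming two edge lists over the same vertex set, one for the social graph and one for the puzzle graph:

```python
import itertools

def jigsaw_solves(social_edges, puzzle_edges, n):
    """Toy jigsaw percolation: vertices 0..n-1 each start in their own
    cluster; two clusters merge whenever some social edge AND some puzzle
    edge each run between them. The puzzle is 'solved' if all vertices
    end up in a single cluster."""
    cluster = list(range(n))  # cluster label of each vertex

    def linked(edges, a, b):
        # Is there an edge of this graph joining clusters a and b?
        return any({cluster[u], cluster[v]} == {a, b} for u, v in edges)

    merged = True
    while merged:
        merged = False
        for a, b in itertools.combinations(sorted(set(cluster)), 2):
            if linked(social_edges, a, b) and linked(puzzle_edges, a, b):
                cluster = [a if c == b else c for c in cluster]  # merge b into a
                merged = True
                break
    return len(set(cluster)) == 1

# Ring puzzle on 6 vertices: a complete social network solves it,
# an empty one cannot merge anything.
ring = [(i, (i + 1) % 6) for i in range(6)]
complete = list(itertools.combinations(range(6), 2))
print(jigsaw_solves(complete, ring, 6))  # True
print(jigsaw_solves([], ring, 6))        # False
```

Running such a simulation over many Erdős–Rényi social graphs at varying edge probability is one way to see the sharp solvability threshold p_c(n) that the paper analyzes rigorously.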
In synthetic ecology, a nascent offshoot of synthetic biology, scientists aim to design and construct microbial communities with desirable properties. Such mixed populations of microorganisms can simultaneously perform otherwise incompatible functions (1). Compared with individual organisms, they can also better resist losses in function as a result of environmental perturbation or invasion by other species (2). Synthetic ecology may thus be a promising approach for developing robust, stable biotechnological processes, such as the conversion of cellulosic biomass to biofuels (3). However, achieving this will require detailed knowledge of the principles that guide the structure and function of microbial communities (see the image).
Ecological communities by design James K. Fredrickson
Antarctic biodiversity is much more extensive, ecologically diverse and biogeographically structured than previously thought. Understanding of how this diversity is distributed in marine and terrestrial systems, the mechanisms underlying its spatial variation, and the significance of the microbiota is growing rapidly. Broadly recognizable drivers of diversity variation include energy availability and historical refugia. The impacts of local human activities and global environmental change nonetheless pose challenges to the current and future understanding of Antarctic biodiversity. Life in the Antarctic and the Southern Ocean is surprisingly rich, and as much at risk from environmental change as it is elsewhere.
The changing form of Antarctic biodiversity Steven L. Chown, Andrew Clarke, Ceridwen I. Fraser, S. Craig Cary, Katherine L. Moon & Melodie A. McGeoch
Geoffrey Hinton has a news bulletin for you: You’re not conscious. OK, you’re conscious as opposed to being unconscious – such as when you fall asleep at night, or when you get knocked out during a boxing match or when a doctor administers a general anesthetic before surgery. But you don’t have some intangible mental quality that worms or daffodils – or toasters, for that matter – lack.
Coevolutionary interactions are thought to have spurred the evolution of key innovations and driven the diversification of much of life on Earth. However, the genetic and evolutionary basis of the innovations that facilitate such interactions remains poorly understood. We examined the coevolutionary interactions between plants (Brassicales) and butterflies (Pieridae), and uncovered evidence for an escalating evolutionary arms race. Although gradual changes in trait complexity appear to have been facilitated by allelic turnover, key innovations are associated with gene and genome duplications. Furthermore, we show that the origins of both chemical defenses and of molecular counter-adaptations were associated with shifts in diversification rates during the arms race. These findings provide an important connection between the origins of biodiversity, coevolution, and the role of gene and genome duplications as a substrate for novel traits.
After growing up together, and mostly growing apart in the second half of the 20th century, the fields of artificial intelligence (AI), cognitive science, and neuroscience are reconverging on a shared view of the computational foundations of intelligence that promotes valuable cross-disciplinary exchanges on questions, methods, and results. We chart advances over the past several decades that address challenges of perception and action under uncertainty through the lens of computation. Advances include the development of representations and inferential procedures for large-scale probabilistic inference and machinery for enabling reflection and decisions about tradeoffs in effort, precision, and timeliness of computations. These tools are deployed toward the goal of computational rationality: identifying decisions with highest expected utility, while taking into consideration the costs of computation in complex real-world problems in which most relevant calculations can only be approximated. We highlight key concepts with examples that show the potential for interchange between computer science, cognitive science, and neuroscience.
Computational rationality: A converging paradigm for intelligence in brains, minds, and machines Samuel J. Gershman, Eric J. Horvitz, Joshua B. Tenenbaum
Many models proposed to study the evolution of collective action rely on a formalism that represents social interactions as n-player games between individuals adopting discrete actions such as cooperate and defect. Despite the importance of spatial structure in biological collective action, the analysis of n-player games in spatially structured populations has so far proved elusive. We address this problem by considering mixed strategies and by integrating discrete-action n-player games into the direct fitness approach of social evolution theory.
This study characterized double-gene deletion mutants of E. coli with the aim of investigating the sub-optimal physiology of the mutants and the possible roles of latent reactions. It considered, in particular, the effect of the order of the gene deletions on the growth rates and substrate uptake rates of the double-gene deletion mutants. The results indicate that the order in which genes are deleted determines the phenotype of the mutants during the sub-optimal growth phase. The mechanism behind the difference between the observed phenotypes was elucidated using transcriptomic analysis and constraint-based modeling of the mutants.
The high population density in cities confers many advantages, including improved social interaction and information exchange. However, it is often argued that urban living comes at the expense of reducing happiness. The goal of this research is to shed light on the relationship between urban communication and urban happiness. We analyze geo-located social media posts (tweets) within a major urban center (Milan) to produce a detailed spatial map of urban sentiments. We combine this data with high-resolution mobile communication intensity data among different urban areas. Our results reveal that happy (respectively unhappy) areas preferentially communicate with other areas of their type. This observation constitutes evidence of homophilous communities at the scale of an entire city (Milan), and has implications on interventions that aim to improve urban well-being.
Misery loves company: happiness and communication in the city Alshamsi A, Awad E, Almehrezi M, Babushkin V, Chang P, Shoroye Z, Tóth A, Rahwan I EPJ Data Science 2015, 4:7
The detection and characterization of self-organized criticality (SOC), in both real and simulated data, has undergone many significant revisions over the past 25 years. The explosive advances in the many numerical methods available for detecting, discriminating, and ultimately testing, SOC have played a critical role in developing our understanding of how systems experience and exhibit SOC. In this article, methods of detecting SOC are reviewed: from correlations to complexity to critical quantities. A description of the basic autocorrelation method leads into a detailed analysis of application-oriented methods developed in the last 25 years. In the second half of this manuscript, space-based, time-based and spatio-temporal methods are reviewed and the prevalence of power laws in nature is described, with an emphasis on event detection and characterization. The search for numerical methods to clearly and unambiguously detect SOC in data often leads us outside the comfort zone of our own disciplines - the answers to these questions are often obtained by studying the advances made in other fields of study. In addition, numerical detection methods often provide the optimum link between simulations and experiments in scientific research. We seek to explore this boundary where the rubber meets the road, to review this expanding field of research of numerical detection of SOC systems over the past 25 years, and to iterate forwards so as to provide some foresight and guidance into developing breakthroughs in this subject over the next quarter of a century.
25 Years of Self-Organized Criticality: Numerical Detection Methods R.T. James McAteer, Markus J. Aschwanden, Michaila Dimitropoulou, Manolis K. Georgoulis, Gunnar Pruessner, Laura Morales, Jack Ireland, Valentyna Abramenko
Conversational modeling is an important task in natural language understanding and machine intelligence. Although previous approaches exist, they are often restricted to specific domains (e.g., booking an airline ticket) and require hand-crafted rules. In this paper, we present a simple approach for this task which uses the recently proposed sequence to sequence framework. Our model converses by predicting the next sentence given the previous sentence or sentences in a conversation. The strength of our model is that it can be trained end-to-end and thus requires far fewer hand-crafted rules. We find that this straightforward model can generate simple conversations given a large conversational training dataset. Our preliminary results suggest that, despite optimizing the wrong objective function, the model is able to extract knowledge from both a domain specific dataset, and from a large, noisy, and general domain dataset of movie subtitles. On a domain-specific IT helpdesk dataset, the model can find a solution to a technical problem via conversations. On a noisy open-domain movie transcript dataset, the model can perform simple forms of common sense reasoning. As expected, we also find that the lack of consistency is a common failure mode of our model.
A Neural Conversational Model Oriol Vinyals, Quoc Le
The oft-repeated claim that Earth’s biota is entering a sixth “mass extinction” depends on clearly demonstrating that current extinction rates are far above the “background” rates prevailing between the five previous mass extinctions. Earlier estimates of extinction rates have been criticized for using assumptions that might overestimate the severity of the extinction crisis. We assess, using extremely conservative assumptions, whether human activities are causing a mass extinction. First, we use a recent estimate of a background rate of 2 mammal extinctions per 10,000 species per 100 years (that is, 2 E/MSY), which is twice as high as widely used previous estimates. We then compare this rate with the current rate of mammal and vertebrate extinctions. The latter is conservatively low because listing a species as extinct requires meeting stringent criteria. Even under our assumptions, which would tend to minimize evidence of an incipient mass extinction, the average rate of vertebrate species loss over the last century is up to 100 times higher than the background rate. Under the 2 E/MSY background rate, the number of species that have gone extinct in the last century would have taken, depending on the vertebrate taxon, between 800 and 10,000 years to disappear. These estimates reveal an exceptionally rapid loss of biodiversity over the last few centuries, indicating that a sixth mass extinction is already under way. Averting a dramatic decay of biodiversity and the subsequent loss of ecosystem services is still possible through intensified conservation efforts, but that window of opportunity is rapidly closing.
Accelerated modern human-induced species losses: Entering the sixth mass extinction Gerardo Ceballos, Paul R. Ehrlich, Anthony D. Barnosky, Andrés García, Robert M. Pringle and Todd M. Palmer
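The background-rate arithmetic behind the abstract's comparison is easy to reproduce. The sketch below uses the paper's 2 E/MSY background rate (2 extinctions per 10,000 species per 100 years, i.e. per million species-years) but plugs in hypothetical species and extinction counts chosen purely for illustration, not taken from the paper:

```python
def years_at_background(observed_extinctions, n_species, rate_e_msy=2.0):
    """Years needed to accumulate the observed number of extinctions if
    losses occurred at a background rate given in E/MSY
    (extinctions per million species-years)."""
    extinctions_per_year = rate_e_msy * n_species / 1_000_000
    return observed_extinctions / extinctions_per_year

# Sanity check of the rate's definition: 2 extinctions among 10,000
# species should take 100 years at 2 E/MSY.
print(years_at_background(2, 10_000))  # 100.0

# Hypothetical taxon of 5,500 species with 35 recorded extinctions
# since 1900: at background rate those losses would need millennia.
print(round(years_at_background(35, 5_500)))  # 3182
```

If a century of observed losses would have required thousands of years at the background rate, the observed rate is correspondingly many times the background, which is the form of the paper's headline comparison.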
The origin of life can be understood mathematically to be the origin of information that can replicate. The likelihood that entropy spontaneously becomes information can be calculated from first principles, and depends exponentially on the amount of information that is necessary for replication. We do not know what the minimum amount of information for self-replication is because it must depend on the local chemistry, but we can study how this likelihood behaves in different known chemistries, and we can study ways in which this likelihood can be enhanced. Here we present evidence from numerical simulations (using the digital life chemistry "Avida") that using a biased probability distribution for the creation of monomers (the "biased typewriter") can exponentially increase the likelihood of spontaneous emergence of information from entropy. We show that this likelihood may depend on the length of the sequence that the information is embedded in, but in a non-trivial manner: there may be an optimum sequence length that maximizes the likelihood. We conclude that the likelihood of spontaneous emergence of self-replication is much more malleable than previously thought, and that the biased probability distributions of monomers that are the norm in biochemistry may significantly enhance these likelihoods.
From Entropy to Information: Biased Typewriters and the Origin of Life Christoph Adami, Thomas LaBar
Big Data on electronic records of social interactions allow approaching human behaviour and sociality from a quantitative point of view with unforeseen statistical power. Mobile telephone Call Detail Records (CDRs), automatically collected by telecom operators for billing purposes, have proven especially fruitful for understanding one-to-one communication patterns as well as the dynamics of social networks that are reflected in such patterns. We present an overview of empirical results on the multi-scale dynamics of social behaviour and networks inferred from mobile telephone calls. We begin with the shortest timescales and fastest dynamics, such as burstiness of call sequences between individuals, and “zoom out” towards longer temporal and larger structural scales, from temporal motifs formed by correlated calls between multiple individuals to long-term dynamics of social groups. We conclude this overview with a future outlook.
From seconds to months: an overview of multi-scale dynamics of mobile telephone calls Jari Saramäki and Esteban Moro