Papers

 Scooped by Complexity Digest onto Papers

# Dynamical Systems on Networks: A Tutorial

We give a tutorial for the study of dynamical systems on networks, and we focus in particular on "simple" situations that are tractable analytically. We briefly motivate why examining dynamical systems on networks is interesting and important. We then give several fascinating examples and discuss some theoretical results. We also discuss dynamical systems on dynamical (i.e., time-dependent) networks, overview software implementations, and give our outlook on the field.

Dynamical Systems on Networks: A Tutorial
Mason A. Porter, James P. Gleeson

http://arxiv.org/abs/1403.7663


Recent publications related to complex systems

## Chinese urbanization 2050: SD modeling and process simulation

Is Chinese urbanization going to take a long time, or can its development goal be achieved by the government in a short time? What is the highest stable urbanization level that China can reach? When can China complete its urbanization? To answer these questions, this paper presents a system dynamics (SD) model of Chinese urbanization, and its validity and simulation are justified by a stock-flow test and a sensitivity analysis using real data from 1998 to 2013. Setting the initial conditions of the simulation by referring to the real data of 2013, the multi-scenario analysis from 2013 to 2050 reveals that Chinese urbanization will reach a level higher than 70% in 2035 and then proceed to a slow urbanization stage regardless of the population policy and GDP growth rate settings; in 2050, Chinese urbanization levels will reach approximately 75%, which is a stable equilibrium level for China. Thus, it can be argued that Chinese urbanization is a long social development process that will require approximately 20 years to complete and that the ultimate urbanization level will be 75–80%, which means that in the distant future, 20–25% of China's population will still live in rural regions.
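
The trajectory the abstract describes (rapid growth past 70% around 2035, then saturation near 75–80%) can be mimicked with a one-equation stock-flow sketch. This is not the paper's SD model, which couples population, GDP and policy scenarios; the logistic form, ceiling and rate below are purely illustrative assumptions.

```python
# Minimal stock-flow sketch of urbanization saturating toward a ceiling.
# NOT the paper's SD model; the ceiling and rate are illustrative assumptions
# tuned so the level passes 70% around 2035 and settles near 75-80%.

def simulate_urbanization(u0=0.54, ceiling=0.78, rate=0.09, years=37):
    """Euler-integrate du/dt = rate * u * (1 - u/ceiling), one step per year,
    starting from the 2013 level u0; returns the yearly levels 2013-2050."""
    levels = [u0]
    u = u0
    for _ in range(years):
        u += rate * u * (1 - u / ceiling)
        levels.append(u)
    return levels

levels = simulate_urbanization()
level_2035 = levels[2035 - 2013]   # passes the 70% mark
```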

Chinese urbanization 2050: SD modeling and process simulation
GU Chao Lin, GUAN Wei Hua, LIU He Lin

Science China Earth Sciences 60(6), 1067-1082 (2017); doi:10.1007/s11430-016-9022-2


## Self-Organization and The Origins of Life: The Managed-Metabolism Hypothesis

The managed-metabolism hypothesis suggests that a cooperation barrier must be overcome if self-producing chemical organizations are to transition from non-life to life. This barrier prevents un-managed, self-organizing, autocatalytic networks of molecular species from individuating into complex, cooperative organizations. The barrier arises because molecular species that could otherwise make significant cooperative contributions to the success of an organization will often not be supported within the organization, and because side reactions and other free-riding processes will undermine cooperation. As a result, the barrier seriously limits the possibility space that can be explored by un-managed organizations, impeding individuation, complex functionality and the transition to life. The barrier can be overcome comprehensively by appropriate management which implements a system of evolvable constraints. The constraints support beneficial co-operators and suppress free riders. In this way management can manipulate the chemical processes of an autocatalytic organization, producing novel processes that serve the interests of the organization as a whole and that could not arise and persist spontaneously in an un-managed chemical organization. Management self-organizes because it is able to capture some of the benefits that are produced when its management of an autocatalytic organization promotes beneficial cooperation. Selection therefore favours the emergence of managers that take over and manage chemical organizations so as to overcome the cooperation barrier. The managed-metabolism hypothesis shows that if management is to overcome the cooperation barrier comprehensively, its interventions must be digitally coded. In this way, the hypothesis accounts for the two-tiered structure of all living cells in which a digitally-coded genetic apparatus manages an analogically-informed metabolism.

Self-Organization and The Origins of Life: The Managed-Metabolism Hypothesis
John E. Stewart


## The Fall of the Empire: The Americanization of English

As global political preeminence gradually shifted from the United Kingdom to the United States, so did the capacity to culturally influence the rest of the world. In this work, we analyze how the world-wide varieties of written English are evolving. We study both the spatial and temporal variations of vocabulary and spelling of English using a large corpus of geolocated tweets and the Google Books datasets corresponding to books published in the US and the UK. The advantage of our approach is that we can address both standard written language (Google Books) and the more colloquial forms of microblogging messages (Twitter). We find that American English is the dominant form of English outside the UK and that its influence is felt even within the UK borders. Finally, we analyze how this trend has evolved over time and the impact that some cultural events have had in shaping it.

The Fall of the Empire: The Americanization of English
Bruno Gonçalves, Lucía Loureiro-Porto, José J. Ramasco, David Sánchez


## [1707.01401] Optimal percolation on multiplex networks

Optimal percolation is the problem of finding the minimal set of nodes such that if the members of this set are removed from a network, the network is fragmented into non-extensive disconnected clusters. The solution of the optimal percolation problem has direct applicability in strategies of immunization in disease spreading processes, and influence maximization for certain classes of opinion dynamical models. In this paper, we consider the problem of optimal percolation on multiplex networks. The multiplex scenario serves to realistically model various technological, biological, and social networks. We find that the multilayer nature of these systems, and more precisely multiplex characteristics such as edge overlap and interlayer degree-degree correlation, profoundly changes the properties of the set of nodes identified as the solution of the optimal percolation problem.
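
The problem setup can be illustrated with a crude baseline: adaptively removing the node of highest aggregate degree from a two-layer multiplex until no layer retains a large cluster. This is a hypothetical toy only, not the authors' optimization method, which finds far smaller removal sets than degree-based heuristics.

```python
# Toy baseline for multiplex dismantling: removing a node deletes it from
# every layer; we greedily remove hubs by aggregate degree. The example
# network and the greedy rule are illustrative assumptions.

from collections import deque

def largest_component(nodes, edges):
    """Size of the largest connected component among the surviving nodes."""
    adj = {v: set() for v in nodes}
    for u, v in edges:
        if u in adj and v in adj:
            adj[u].add(v)
            adj[v].add(u)
    seen, best = set(), 0
    for s in adj:
        if s in seen:
            continue
        size, queue = 0, deque([s])
        seen.add(s)
        while queue:
            u = queue.popleft()
            size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best

def greedy_dismantle(nodes, layers, target):
    """Remove highest aggregate-degree nodes (degree summed over layers)
    until the largest component of every layer is at most `target`."""
    nodes = set(nodes)
    removed = []
    while max(largest_component(nodes, layer) for layer in layers) > target:
        degree = {v: 0 for v in nodes}
        for layer in layers:
            for u, v in layer:
                if u in degree and v in degree:
                    degree[u] += 1
                    degree[v] += 1
        hub = max(degree, key=degree.get)
        nodes.discard(hub)
        removed.append(hub)
    return removed

# two layers over the same six nodes, with edge overlap on (0, 1)
layer_a = [(0, 1), (0, 2), (0, 3), (2, 3)]
layer_b = [(0, 1), (0, 4), (0, 5), (4, 5)]
removal = greedy_dismantle(range(6), [layer_a, layer_b], target=2)
```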

Optimal percolation on multiplex networks
Saeed Osat, Ali Faqeeh, Filippo Radicchi


## Dynamics of organizational culture: Individual beliefs vs. social conformity

The complex nature of organizational culture challenges our ability to infer its underlying dynamics from observational studies. Recent computational studies have adopted a distinctly different view, where plausible mechanisms are proposed to describe a wide range of social phenomena, including the onset and evolution of organizational culture. In this spirit, this work introduces an empirically grounded, agent-based model which relaxes two assumptions of past work: (a) the omission of an individual's striving for cognitive coherence, and (b) the limited integration of important contextual factors. It does so by utilizing networks of beliefs and incorporating social rank into the dynamics. As a result, we illustrate that: (i) an organization may appear increasingly coherent in terms of its organizational culture, yet be composed of individuals with reduced levels of coherence; (ii) the components of social conformity (peer pressure and social rank) are influential at different aggregation levels.

Ellinas C, Allan N, Johansson A (2017) Dynamics of organizational culture: Individual beliefs vs. social conformity. PLoS ONE 12(6): e0180193. https://doi.org/10.1371/journal.pone.0180193


## Scalable Co-Optimization of Morphology and Control in Embodied Machines

Evolution sculpts both the body plans and nervous systems of agents together over time. In contrast, in AI and robotics, a robot's body plan is usually designed by hand, and control policies are then optimized for that fixed design. The task of simultaneously co-optimizing the morphology and controller of an embodied robot has remained a challenge -- as evidenced by the little improvement upon early techniques over the decades since their introduction. Embodied cognition posits that behavior arises from a close coupling between body plan and sensorimotor control, which suggests why co-optimizing these two subsystems is so difficult: most evolutionary changes to morphology tend to adversely impact sensorimotor control, leading to an overall decrease in behavioral performance. Here, we further examine this hypothesis and demonstrate a technique for "morphological innovation protection", which reduces selection pressure on recently morphologically-changed individuals, thus enabling evolution some time to "readapt" to the new morphology with subsequent control policy mutations. This treatment tends to yield individuals that are significantly more fit than those that existed before the morphological change and increases evolvability. We also show the potential for this method to avoid local optima and show fitness increases further into optimization, as well as the potential for convergence to similar highly fit morphologies across widely varying initial conditions. While this technique is admittedly only the first of many steps that must be taken to achieve scalable optimization of embodied machines, we hope that theoretical insight into the cause of evolutionary stagnation in current methods will help to enable the automation of robot design and behavioral training.
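
The core intuition above (a morphological mutation first lowers fitness until the controller readapts) can be captured in a three-line toy. The quadratic mismatch fitness below is our own stand-in, not the paper's evolutionary-robotics setup.

```python
# Toy illustration of why morphological innovation needs protection:
# fitness rewards larger morphology m, but penalizes mismatch between
# morphology and controller c. The functional form is an assumption.

def fitness(m, c):
    """Reward a larger morphology m; penalize controller mismatch (m - c)^2."""
    return m - 2.0 * (m - c) ** 2

m, c = 0.0, 0.0
f_before = fitness(m, c)

m += 1.0                        # a morphological innovation...
f_innovated = fitness(m, c)     # ...initially lowers fitness, so greedy
                                # selection would cull the mutant immediately

c = m                           # protection buys time for the controller
f_readapted = fitness(m, c)     # ...after which the lineage beats its ancestor
```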

Scalable Co-Optimization of Morphology and Control in Embodied Machines
Nick Cheney, Josh Bongard, Vytas SunSpiral, Hod Lipson


## Looking into Pandora's Box: The Content of Sci-Hub and its Usage

Looking into Pandora's Box: The Content of Sci-Hub and its Usage

Bastian Greshake

F1000Research Article


## How an ethics-based approach works with global agendas

Is “ethics” a useless word when it comes to politics, policy-making, business, international affairs, laws, governments and real-world situations? The rule of law exists and, to varying extent, it governs power systems, wills and decisions of individuals and organizations, determining the status quo we live with. Then, why have an ethics-based approach at all?

How an Ethics-Based Approach Works with Global Agendas
Barış Bayram


## Characterizing information importance and the effect on the spread in various graph topologies

In this paper we present a thorough analysis of the nature of news in different media across the ages, introducing a unique mathematical model to fit the characteristics of information spread. This model enhances the information diffusion model to account for conflicting information and the topical distribution of news in terms of popularity for a given era. We translate this information to a separate graphical node model to determine the spread of a news item given a certain category and relevance factor. The two models are used as a base for a simulation of information dissemination for varying graph topologies. The simulation is stress-tested and compared against real-world data to prove its relevancy. We are then able to use these simulations to deduce some conclusive statements about the optimization of information spread.

Characterizing information importance and the effect on the spread in various graph topologies
James Flamino, Alexander Norman, Madison Wyatt

 Suggested by Fil Menczer

## Limited individual attention and online virality of low-quality information

Social media are massive marketplaces where ideas and news compete for our attention. Previous studies have shown that quality is not a necessary condition for online virality and that knowledge about peer choices can distort the relationship between quality and popularity. However, these results do not explain the viral spread of low-quality information, such as the digital misinformation that threatens our democracy. We investigate quality discrimination in a stylized model of an online social network, where individual agents prefer quality information, but have behavioural limitations in managing a heavy flow of information. We measure the relationship between the quality of an idea and its likelihood of becoming prevalent at the system level. We find that both information overload and limited attention contribute to a degradation of the market’s discriminative power. A good tradeoff between discriminative power and diversity of information is possible according to the model. However, calibration with empirical data characterizing information load and finite attention in real social media reveals a weak correlation between quality and popularity of information. In these realistic conditions, the model predicts that low-quality information is just as likely to go viral, providing an interpretation for the high volume of misinformation we observe online.
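
A heavily simplified sketch of this flavor of model (complete network, one global timeline, quality-weighted resharing from a short feed): all parameters below are assumptions, and the paper's actual model differs in detail.

```python
# Toy meme-diffusion sketch with finite attention: each agent keeps only a
# short feed, and reshares proportionally to quality. Network structure,
# rates and feed size are illustrative assumptions, not the paper's.

import random

def simulate(n_agents=50, steps=3000, feed_size=5, p_new=0.3, seed=1):
    """Each step one agent either posts a new meme (prob p_new) or reshares
    a meme from its short feed, chosen with probability proportional to
    quality; every (re)share is pushed onto all feeds, evicting the oldest."""
    random.seed(seed)
    quality = {}                       # meme id -> intrinsic quality
    shares = {}                        # meme id -> number of (re)shares
    feeds = [[] for _ in range(n_agents)]
    next_id = 0
    for _ in range(steps):
        agent = random.randrange(n_agents)
        if random.random() < p_new or not feeds[agent]:
            meme, next_id = next_id, next_id + 1
            quality[meme] = random.uniform(0.05, 1.0)
            shares[meme] = 0
        else:
            pool = feeds[agent]        # finite attention: only these compete
            meme = random.choices(pool, weights=[quality[m] for m in pool])[0]
        shares[meme] += 1
        for f in feeds:                # broadcast on a complete network
            f.append(meme)
            if len(f) > feed_size:
                f.pop(0)
    return quality, shares

quality, shares = simulate()
```

Plotting `shares` against `quality` for runs with small `feed_size` and large `steps` gives a feel for how finite attention weakens the quality-popularity relationship.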

Limited individual attention and online virality of low-quality information
Xiaoyan Qiu, Diego F. M. Oliveira, Alireza Sahami Shirazi, Alessandro Flammini & Filippo Menczer

Nature Human Behaviour 1, Article number: 0132 (2017)
doi:10.1038/s41562-017-0132


## Multiscale Information Theory and the Marginal Utility of Information

Complex systems display behavior at a range of scales. Large-scale behaviors can emerge from the correlated or dependent behavior of individual small-scale components. To capture this observation in a rigorous and general way, we introduce a formalism for multiscale information theory. Dependent behavior among system components results in overlapping or shared information. A system’s structure is revealed in the sharing of information across the system’s dependencies, each of which has an associated scale. Counting information according to its scale yields the quantity of scale-weighted information, which is conserved when a system is reorganized. In the interest of flexibility we allow information to be quantified using any function that satisfies two basic axioms. Shannon information and vector space dimension are examples. We discuss two quantitative indices that summarize system structure: an existing index, the complexity profile, and a new index, the marginal utility of information. Using simple examples, we show how these indices capture the multiscale structure of complex systems in a quantitative way.

Multiscale Information Theory and the Marginal Utility of Information
Benjamin Allen, Blake C. Stacey, and Yaneer Bar-Yam

Entropy 2017, 19(6), 273; doi:10.3390/e19060273


## Zika virus evolution and spread in the Americas

One hundred and ten Zika virus genomes from ten countries and territories involved in the Zika virus epidemic reveal rapid expansion of the epidemic within Brazil and multiple introductions to other regions.

Zika virus evolution and spread in the Americas
Hayden C. Metsky, et al.

Nature 546, 411–415 (15 June 2017) doi:10.1038/nature22402


## Collective benefits in traffic during mega events via the use of information technologies

Information technologies today can inform each of us about the route with the shortest time, but they do not contain incentives to manage travellers such that we all get collective benefits in travel times. To that end we need travel demand estimates and target strategies to reduce the traffic volume from the congested roads during peak hours in a feasible way. During large events, the traffic inconveniences in large cities are unusually high, yet temporary, and the entire population may be more willing to adopt collective recommendations for collective benefits in traffic. In this paper, we integrate, for the first time, big data resources to estimate the impact of events on traffic and propose target strategies for collective good at the urban scale. In the context of the Olympic Games in Rio de Janeiro, we first predict the expected increase in traffic. To that end, we integrate data from mobile phones, Airbnb, Waze and transit information, with game schedules and expected attendance in each venue. Next, we evaluate different route choice scenarios for drivers during the peak hours. Finally, we gather information on the trips that contribute the most to the global congestion which could be redirected from vehicles to transit. Interestingly, we show that (i) following new route alternatives during the event with individual shortest times can save more collective travel time than keeping the routine routes used before the event, uncovering the positive value of information technologies during events; (ii) with only a small proportion of people selected from specific areas switching from driving to public transport, the collective travel time can be reduced to a great extent. Results are presented online for evaluation by the public and policymakers.

Collective benefits in traffic during mega events via the use of information technologies
Yanyan Xu, Marta C. González

Journal of The Royal Society Interface 14(129), April 2017; DOI: 10.1098/rsif.2016.1041

 Suggested by Fil Menczer

## The spread of fake news by social bots

The massive spread of fake news has been identified as a major global risk and has been alleged to influence elections and threaten democracies. Communication, cognitive, social, and computer scientists are engaged in efforts to study the complex causes for the viral diffusion of digital misinformation and to develop solutions, while search and social media platforms are beginning to deploy countermeasures. However, to date, these efforts have been mainly informed by anecdotal evidence rather than systematic data. Here we analyze 14 million messages spreading 400 thousand claims on Twitter during and following the 2016 U.S. presidential campaign and election. We find evidence that social bots play a key role in the spread of fake news. Accounts that actively spread misinformation are significantly more likely to be bots. Automated accounts are particularly active in the early spreading phases of viral claims, and tend to target influential users. Humans are vulnerable to this manipulation, retweeting bots who post false news. Successful sources of false and biased claims are heavily supported by social bots. These results suggest that curbing social bots may be an effective strategy for mitigating the spread of online misinformation.

The spread of fake news by social bots

Chengcheng Shao, Giovanni Luca Ciampaglia, Onur Varol, Alessandro Flammini, Filippo Menczer


## Thermodynamics of Evolutionary Games

How cooperation can evolve between players is an unsolved problem of biology. Here we use Hamiltonian dynamics of models of the Ising type to describe populations of cooperating and defecting players to show that the equilibrium fraction of cooperators is given by the expectation value of a thermal observable akin to a magnetization. We apply the formalism to the Public Goods game with three players, and show that a phase transition between cooperation and defection occurs that is equivalent to a transition in one-dimensional Ising crystals with long-range interactions. We also investigate the effect of punishment on cooperation and find that punishment acts like a magnetic field that leads to an "alignment" between players, thus encouraging cooperation. We suggest that a thermal Hamiltonian picture of the evolution of cooperation can generate other insights about the dynamics of evolving groups by mining the rich literature of critical dynamics in low-dimensional spin systems.
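
The thermal picture can be illustrated with back-of-envelope numbers: weight each strategy profile of a 3-player Public Goods game by exp(beta × total payoff) and read off the expected cooperator fraction. The Hamiltonian choice and all parameters below are our illustrative assumptions, not the paper's exact formalism.

```python
# Sketch of a "thermal" cooperator fraction for a 3-player Public Goods game.
# Assumptions: Boltzmann weights on total payoff, contribution cost 1,
# multiplier r = 2, inverse temperature beta = 1 (illustrative only).

from itertools import product
from math import exp

def cooperator_fraction(beta=1.0, r=2.0, n=3, punishment=0.0):
    """Expected cooperator fraction <k/n> when each strategy profile is
    weighted by exp(beta * total payoff) in an n-player Public Goods game."""
    num = den = 0.0
    for profile in product([0, 1], repeat=n):        # 1 = cooperate
        k = sum(profile)
        pot_share = r * k / n                        # each player's return
        total = sum(pot_share - s for s in profile)  # cooperators pay cost 1
        total -= punishment * (n - k)                # fine every defector
        w = exp(beta * total)
        num += w * k / n
        den += w
    return num / den

base = cooperator_fraction()
fined = cooperator_fraction(punishment=0.5)
# punishment acts like an aligning field: fined exceeds base
```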

Thermodynamics of Evolutionary Games


## To slow, or not to slow? New science in sub-second networks

What happens when you slow down part of an ultrafast network that is operating quicker than the blink of an eye, e.g. electronic exchange network, navigational systems in driverless vehicles, or even neuronal processes in the brain? This question just adopted immediate commercial, legal and political importance following U.S. financial regulators' decision to allow a new network node to intentionally introduce delays of microseconds. Though similar requests are set to follow, there is still no scientific understanding available to policymakers of the likely system-wide impact of such delays. Giving academic researchers access to (so far prohibitively expensive) microsecond exchange data would help rectify this situation. As a by-product, the lessons learned would deepen understanding of instabilities across myriad other networks, e.g. impact of millisecond delays on brain function and safety of driverless vehicle navigation systems beyond human response times.

To slow, or not to slow? New science in sub-second networks

Neil F. Johnson


## Development of structural correlations and synchronization from adaptive rewiring in networks of Kuramoto oscillators

Synchronization of non-identical oscillators coupled through complex networks is an important example of collective behavior. It is interesting to ask how the structural organization of network interactions influences this process. Several studies have uncovered optimal topologies for synchronization by making purposeful alterations to a network. Yet, the connectivity patterns of many natural systems are often not static, but are rather modulated over time according to their dynamics. This co-evolution - and the extent to which the dynamics of the individual units can shape the organization of the network itself - is not well understood. Here, we study initially randomly connected but locally adaptive networks of Kuramoto oscillators. The system employs a co-evolutionary rewiring strategy that depends only on instantaneous, pairwise phase differences of neighboring oscillators, and that conserves the total number of edges, allowing the effects of local reorganization to be isolated. We find that a simple regulatory rule - which preserves connections between more out-of-phase oscillators while rewiring connections between more in-phase oscillators - can cause initially disordered networks to organize into more structured topologies that support enhanced synchronization dynamics. We examine how this process unfolds over time, finding both a dependence on the intrinsic frequencies of the oscillators and the global coupling. For large enough coupling and after sufficient adaptation, the resulting networks exhibit degree-frequency and frequency-neighbor-frequency correlations. These properties have previously been associated with optimal synchronization or explosive transitions. By considering a time-dependent interplay between structure and dynamics, this work offers a mechanism through which emergent phenomena can arise in complex systems utilizing local rules.
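
A minimal sketch of the rewiring rule described above, with illustrative sizes and rates of our own choosing: integrate the Kuramoto phases, then periodically rewire the most in-phase edge to a random absent pair, which conserves the edge count by construction.

```python
# Sketch of co-evolving Kuramoto oscillators: preserve out-of-phase edges,
# rewire the most in-phase one. Network size, coupling, time step and
# rewiring period are illustrative assumptions, not the paper's settings.

import math
import random

def kuramoto_step(theta, omega, edges, K, dt=0.05):
    """One Euler step of the Kuramoto model on the given edge set."""
    drift = list(omega)
    for i, j in edges:
        pull = K * math.sin(theta[j] - theta[i])
        drift[i] += pull
        drift[j] -= pull
    return [(t + dt * d) % (2 * math.pi) for t, d in zip(theta, drift)]

def rewire_most_in_phase(theta, edges, rng):
    """Delete the edge with the smallest phase difference and add a random
    absent edge, so the total number of edges is conserved."""
    n = len(theta)
    def gap(e):
        d = abs(theta[e[0]] - theta[e[1]]) % (2 * math.pi)
        return min(d, 2 * math.pi - d)
    edges = set(edges)
    edges.discard(min(edges, key=gap))
    absent = [(i, j) for i in range(n) for j in range(i + 1, n)
              if (i, j) not in edges]
    edges.add(rng.choice(absent))
    return edges

rng = random.Random(0)
n, K, n_edges = 10, 0.8, 15
theta = [rng.uniform(0, 2 * math.pi) for _ in range(n)]
omega = [rng.gauss(0, 0.5) for _ in range(n)]
edges = set()
while len(edges) < n_edges:                       # random initial graph
    i, j = sorted(rng.sample(range(n), 2))
    edges.add((i, j))
for step in range(500):
    theta = kuramoto_step(theta, omega, edges, K)
    if step % 25 == 0:                            # occasional adaptation
        edges = rewire_most_in_phase(theta, edges, rng)
```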

Development of structural correlations and synchronization from adaptive rewiring in networks of Kuramoto oscillators
Lia Papadopoulos, Jason Kim, Jurgen Kurths, Danielle S. Bassett


## [1706.05043] The thermodynamic efficiency of computations made in cells across the range of life

Biological organisms must perform computation as they grow, reproduce, and evolve. Moreover, ever since Landauer's bound was proposed it has been known that all computation has some thermodynamic cost -- and that the same computation can be achieved with greater or smaller thermodynamic cost depending on how it is implemented. Accordingly, an important issue concerning the evolution of life is assessing the thermodynamic efficiency of the computations performed by organisms. This issue is interesting both from the perspective of how close life has come to maximally efficient computation (presumably under the pressure of natural selection), and from the practical perspective of what efficiencies we might hope that engineered biological computers might achieve, especially in comparison with current computational systems. Here we show that the computational efficiency of translation, defined as free energy expended per amino acid operation, outperforms the best supercomputers by several orders of magnitude, and is only about an order of magnitude worse than the Landauer bound. However this efficiency depends strongly on the size and architecture of the cell in question. In particular, we show that the *useful* efficiency of an amino acid operation, defined as the bulk energy per amino acid polymerization, decreases for increasing bacterial size and converges to the polymerization cost of the ribosome. This cost of the largest bacteria does not change in cells as we progress through the major evolutionary shifts to both single and multicellular eukaryotes. However, the rates of total computation per unit mass are nonmonotonic in bacteria with increasing cell size, and also change across different biological architectures including the shift from unicellular to multicellular eukaryotes.
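
The headline comparison can be reproduced at back-of-envelope precision. The 4-ATP-per-peptide-bond cost and log2(20) bits per amino-acid operation are textbook ballparks we assume here, not the paper's measured values.

```python
# Rough arithmetic: Landauer's bound at physiological temperature vs. an
# assumed free-energy cost per amino-acid operation during translation.

import math

k_B = 1.380649e-23                     # Boltzmann constant, J/K
T = 300.0                              # roughly physiological temperature, K
landauer_per_bit = k_B * T * math.log(2)

# assumed bookkeeping: ~4 ATP hydrolyses (~30.5 kJ/mol each) per peptide bond
atp_hydrolysis = 30.5e3 / 6.022e23     # J per ATP molecule
cost_per_amino_acid = 4 * atp_hydrolysis

# selecting 1 of 20 amino acids is ~log2(20) bits of computation
bits_per_operation = math.log2(20)
landauer_per_operation = bits_per_operation * landauer_per_bit

ratio = cost_per_amino_acid / landauer_per_operation
# the ratio comes out of order 10, consistent with the abstract's claim that
# translation is about an order of magnitude above the Landauer bound
```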

The thermodynamic efficiency of computations made in cells across the range of life
Christopher P. Kempes, David Wolpert, Zachary Cohen, Juan Pérez-Mercader

ComplexInsight's curator insight,
The concept of computation as it occurs in biology is fascinating, and this paper is likely to become a classic - worth reading.

## Mathematicians Decode the Surprising Complexity of Cow Herds

Do me a favor and picture a pasture dotted with a herd of grazing cows. Some stand and stare at you with that patented cow stare, others bury their heads in the green, green grass, while still others have lain down for a rest. Tranquil, right? About as simple as life gets?
Well, I’m sorry to say that your idea of the herd life may be a lie. Because a new mathematical model posits that while they don’t look it, cow herds may be extremely dynamic, secretly contentious gatherings of warring interests. Yes, with the help of a biologist, mathematicians calculated the fascinating dynamics of cow herds, and yes, they reported it today in a journal called Chaos.

## Universal fractality of morphological transitions in stochastic growth processes

Stochastic growth processes give rise to diverse and intricate structures everywhere in nature, often referred to as fractals. In general, these complex structures reflect the non-trivial competition among the interactions that generate them. In particular, the paradigmatic Laplacian-growth model exhibits a characteristic fractal to non-fractal morphological transition as the non-linear effects of its growth dynamics increase. So far, a complete scaling theory for this type of transitions, as well as a general analytical description for their fractal dimensions have been lacking. In this work, we show that despite the enormous variety of shapes, these morphological transitions have clear universal scaling characteristics. Using a statistical approach to fundamental particle-cluster aggregation, we introduce two non-trivial fractal to non-fractal transitions that capture all the main features of fractal growth. By analyzing the respective clusters, in addition to constructing a dynamical model for their fractal dimension, we show that they are well described by a general dimensionality function regardless of their space symmetry-breaking mechanism, including the Laplacian case itself. Moreover, under the appropriate variable transformation this description is universal, i.e., independent of the transition dynamics, the initial cluster configuration, and the embedding Euclidean space.
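
For readers who want a concrete instance of "particle-cluster aggregation", here is a bare-bones on-lattice DLA walker. It is a sketch of the paradigmatic process only, not the paper's generalized transition models, and the lattice size and particle count are arbitrary.

```python
# Bare-bones diffusion-limited aggregation (DLA) on a square lattice:
# the classic example of a stochastic growth process producing a fractal.

import random

def grow_cluster(n_particles=40, box=31, seed=2):
    """Release random walkers from the box boundary; a walker sticks when it
    steps next to the cluster, and is relaunched if it leaves the box."""
    rng = random.Random(seed)
    center = box // 2
    cluster = {(center, center)}
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    while len(cluster) < n_particles:
        if rng.random() < 0.5:            # launch on a random boundary cell
            x, y = rng.randrange(box), rng.choice([0, box - 1])
        else:
            x, y = rng.choice([0, box - 1]), rng.randrange(box)
        while True:
            dx, dy = rng.choice(moves)
            x, y = x + dx, y + dy
            if not (0 <= x < box and 0 <= y < box):
                break                     # escaped: relaunch a new walker
            if any((x + mx, y + my) in cluster for mx, my in moves):
                cluster.add((x, y))       # aggregate next to the cluster
                break
    return cluster

cluster = grow_cluster()
```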

Universal fractality of morphological transitions in stochastic growth processes
J. R. Nicolás-Carlock, J. L. Carrillo-Estrada & V. Dossetti
Scientific Reports 7, Article number: 3523 (2017)
doi:10.1038/s41598-017-03491-5

nukem777's curator insight,

Golden Apple to anyone who translates this into English.


## Empowerment As Replacement for the Three Laws of Robotics

The greater ubiquity of robots creates a need for generic guidelines for robot behavior. We focus less on how a robot can technically achieve a predefined goal and more on what a robot should do in the first place. Particularly, we are interested in the question of what a heuristic that motivates the robot's behavior in interaction with human agents should look like. We make a concrete, operational proposal as to how the information-theoretic concept of empowerment can be used as a generic heuristic to quantify concepts, such as self-preservation, protection of the human partner, and responding to human actions. While elsewhere we studied involved single-agent scenarios in detail, here, we present proof-of-principle scenarios demonstrating how empowerment interpreted in light of these perspectives allows one to specify core concepts with a similar aim as Asimov's Three Laws of Robotics in an operational way. Importantly, this route does not depend on having to establish an explicit verbalized understanding of human language and conventions in the robots. Also, it incorporates the ability to take into account a rich variety of different situations and types of robotic embodiment.

Empowerment As Replacement for the Three Laws of Robotics

Christoph Salge, Daniel Polani

Front. Robot. AI, 29 June 2017 | https://doi.org/10.3389/frobt.2017.00025


## 1D Printing of Recyclable Robots

Recent advances in 3D printing are revolutionizing manufacturing, enabling the fabrication of structures with unprecedented complexity and functionality. Yet biological systems are able to fabricate systems with far greater complexity using a process that involves assembling and folding a linear string. Here, we demonstrate a 1D printing system that uses an approach inspired by the ribosome to fabricate a variety of specialized robotic automata from a single string of source material. This proof-of-concept system involves both a novel manufacturing platform that configures the source material using folding and a computational optimization tool that allows designs to be produced from the specification of high-level goals. We show that our 1D printing system is able to produce three distinct robots from the same source material, each of which is capable of accomplishing a specialized locomotion task. Moreover, we demonstrate the ability of the printer to use recycled material to produce new designs, enabling an autonomous manufacturing ecosystem capable of repurposing previous iterations to accomplish new tasks.

Title: 1D Printing of Recyclable Robots
Authors: Daniel Cellucci, Robert MacCurdy, Hod Lipson, Sebastian Risi (2017)
In: IEEE Robotics and Automation Letters (RA-L).
Video: https://youtu.be/ElW0O2IiuXA

 Suggested by mohsen mosleh

## Efficient Integration in Multi-Community Networks

We study structures for efficient integration of multi-community networks where building bridges across communities incurs an additional link cost compared to links within a community. Building on the connections models with direct link cost and direct and indirect benefits, we show that the efficient structure for homogeneous cost and benefit parameters, and for communities of arbitrary size, always has a diameter no greater than 3. We further show that if the internal cost is not small enough to justify a full graph for each community, integration always follows one of these two structures: either a single star, or a new structure we introduce in this paper, called parallel-hyperstar, which is a special multi-core/periphery structure with parallel links among core nodes of different communities. We offer cost and benefit conditions where each structure is efficient and discuss the stability conditions of those structures.
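
A quick numeric illustration of the connections-model accounting behind such results, with our own toy parameters (benefit decay δ = 0.5 per hop, link cost 0.4; not the paper's values): the short-diameter star dominates a ring on the same six nodes.

```python
# Welfare in a connections-style model: each node receives delta**distance
# from every other reachable node and pays `cost` per incident link.
# Parameters and the star-vs-ring comparison are illustrative assumptions.

from collections import deque

def welfare(n, edges, delta=0.5, cost=0.4):
    """Total welfare: sum over ordered node pairs of delta**distance,
    minus cost paid by both endpoints of every link."""
    adj = {i: set() for i in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    total = -2 * cost * len(edges)
    for s in range(n):                  # BFS distances from s
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(delta ** d for v, d in dist.items() if v != s)
    return total

n = 6
star = [(0, i) for i in range(1, n)]         # diameter 2, n - 1 links
ring = [(i, (i + 1) % n) for i in range(n)]  # diameter 3, n links
```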


## How to fight corruption

Anticorruption initiatives are often put forth as solutions to problems of waste and inefficiency in government programs. It's easy to see why. So often, somewhere along the chain that links the many participants in public service provision or other government activities, funds may get stolen or misdirected, bribes exchanged for preferential treatment, or genuine consumers of public services supplemented by “ghost” users. As a result, corruption reduces economic growth and leaves citizens disillusioned and distrustful of government. It is tempting to think that more monitoring, stricter sanctions, or positive inducements for suitable behavior will reduce corruption. But every anticorruption or antifraud program elicits a strategic response by those who orchestrated and benefited from wrongdoing in the first place. How can these unintended consequences be anticipated and avoided?

How to fight corruption
Raymond Fisman, Miriam Golden

Science  26 May 2017:
Vol. 356, Issue 6340, pp. 803-804
DOI: 10.1126/science.aan081


## The Human Microbiome and the Missing Heritability Problem

The “missing heritability” problem states that genetic variants in Genome-Wide Association Studies (GWAS) cannot completely explain the heritability of complex traits. Traditionally, the heritability of a phenotype is measured through familial studies using twins, siblings and other close relatives, making assumptions on the genetic similarities between them. When this heritability is compared to the one obtained through GWAS for the same traits, a substantial gap between both measurements arises, with genome-wide studies reporting significantly smaller values. Several mechanisms for this “missing heritability” have been proposed, such as epigenetics, epistasis, and sequencing depth. However, none of them are able to fully account for this gap in heritability. In this paper we provide evidence that suggests that in order for the phenotypic heritability of human traits to be broadly understood and accounted for, the compositional and functional diversity of the human microbiome must be taken into account. This hypothesis is based on several observations: (A) The composition of the human microbiome is associated with many important traits, including obesity, cancer, and neurological disorders. (B) Our microbiome encodes a second genome with nearly 100 times more genes than the human genome, and this second genome may act as a rich source of genetic variation and phenotypic plasticity. (C) Human genotypes interact with the composition and structure of our microbiome, but cannot by themselves explain microbial variation. (D) Microbial genetic composition can be strongly influenced by the host's behavior, its environment or by vertical and horizontal transmissions from other hosts. Therefore, genetic similarities assumed in familial studies may cause overestimations of heritability values. We also propose a method that allows the compositional and functional diversity of our microbiome to be incorporated into genome-wide association studies.

The Human Microbiome and the Missing Heritability Problem

Santiago Sandoval-Motta, Maximino Aldana, Esperanza Martínez-Romero and Alejandro Frank

Front. Genet., 13 June 2017 | https://doi.org/10.3389/fgene.2017.00080
