Papers
Recent publications related to complex systems
Curated by Complexity Digest

Coauthorship and citation in scientific publishing

A large number of published studies have examined the properties of either networks of citation among scientific papers or networks of coauthorship among scientists. Here, using an extensive data set covering more than a century of physics papers published in the Physical Review, we study a hybrid coauthorship/citation network that combines the two, which we analyze to gain insight into the correlations and interactions between authorship and citation. Among other things, we investigate the extent to which individuals tend to cite themselves or their collaborators more than others, the extent to which they cite themselves or their collaborators more quickly after publication, and the extent to which they tend to return the favor of a citation from another scientist.

 


Travis Martin, Brian Ball, Brian Karrer, M. E. J. Newman

http://arxiv.org/abs/1304.0473
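The kind of measurement described above is easy to prototype once each paper record carries both an author list and a reference list. Below is a minimal Python sketch, on hypothetical bibliographic records, of the self/coauthor-citation fraction the authors study; the data structure and function name are illustrative, not the paper's code.

```python
# Hypothetical toy records: each paper has an author set and a set of cited papers.
papers = {
    "P1": {"authors": {"newman", "martin"}, "cites": set()},
    "P2": {"authors": {"newman"}, "cites": {"P1"}},
    "P3": {"authors": {"ball"}, "cites": {"P1", "P2"}},
}

def self_citation_fraction(paper_id: str) -> float:
    """Share of outgoing citations that share at least one author with the citer."""
    p = papers[paper_id]
    if not p["cites"]:
        return 0.0
    self_cites = sum(1 for q in p["cites"]
                     if papers[q]["authors"] & p["authors"])
    return self_cites / len(p["cites"])

for pid in papers:
    print(pid, self_citation_fraction(pid))   # P2 cites its own author -> 1.0
```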


What is consciousness, and could machines have it?

The controversial question of whether machines may ever be conscious must be based on a careful consideration of how consciousness arises in the only physical system that undoubtedly possesses it: the human brain. We suggest that the word “consciousness” conflates two different types of information-processing computations in the brain: the selection of information for global broadcasting, thus making it flexibly available for computation and report (C1, consciousness in the first sense), and the self-monitoring of those computations, leading to a subjective sense of certainty or error (C2, consciousness in the second sense). We argue that despite their recent successes, current machines are still mostly implementing computations that reflect unconscious processing (C0) in the human brain. We review the psychological and neural science of unconscious (C0) and conscious computations (C1 and C2) and outline how they may inspire novel machine architectures.

 

Stanislas Dehaene, Hakwan Lau, Sid Kouider

Science  27 Oct 2017:
Vol. 358, Issue 6362, pp. 486-492
DOI: 10.1126/science.aan8871


How the Body’s Trillions of Clocks Keep Time

Cellular clocks are almost everywhere. Clues to how they work are coming from the places that they’re not.
nukem777's curator insight, December 2, 5:33 AM
And that explains why I'm always late

A multilayer network analysis of hashtags in twitter via co-occurrence and semantic links

Complex network studies, as an interdisciplinary framework, span a large variety of subjects, including social media. In social networks, several mechanisms generate miscellaneous structures like friendship networks, mention networks, tag networks, etc. Focusing on tag networks (namely, hashtags in Twitter), we performed a two-layer analysis of tag networks from a massive dataset of Twitter entries. The first layer is constructed by converting the co-occurrences of these tags in a single entry (tweet) into links, while the second layer is constructed by converting the semantic relations of the tags into links. We observed that the universal properties of real networks, such as the small-world property, clustering, and power-law distributions in various network parameters, are also evident in the multilayer network of hashtags. Moreover, we found that co-occurrences of hashtags in tweets are mostly coupled with semantic relations, whereas a small number of semantically unrelated, and therefore random, links reduce node separation and network diameter in the co-occurrence network layer. Together with the degree distributions, the power-law consistency of the degree-difference, edge-weight, and cosine-similarity distributions in both layers is a further manifestation of Zipf's law in nature.

 

İlker Türker and Eyüb Ekmel Sulak, Int. J. Mod. Phys. B
https://doi.org/10.1142/S0217979218500297
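As a rough illustration of the first (co-occurrence) layer described above: every pair of hashtags appearing in the same tweet becomes a weighted link. A minimal sketch using networkx on hypothetical tweets; the semantic layer would require an external lexical resource and is omitted.

```python
from itertools import combinations
import networkx as nx

# Hypothetical tweets, each reduced to its set of hashtags.
tweets = [
    {"#complexity", "#networks"},
    {"#complexity", "#physics", "#networks"},
    {"#music"},
]

G = nx.Graph()
for tags in tweets:
    for u, v in combinations(sorted(tags), 2):
        # accumulate co-occurrence counts as edge weights
        if G.has_edge(u, v):
            G[u][v]["weight"] += 1
        else:
            G.add_edge(u, v, weight=1)

print(G.edges(data=True))   # ('#complexity', '#networks') has weight 2
```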


A semi-synthetic organism that stores and retrieves increased genetic information

Since at least the last common ancestor of all life on Earth, genetic information has been stored in a four-letter alphabet that is propagated and retrieved by the formation of two base pairs. The central goal of synthetic biology is to create new life forms and functions, and the most general route to this goal is the creation of semi-synthetic organisms whose DNA harbours two additional letters that form a third, unnatural base pair. Previous efforts to generate such semi-synthetic organisms culminated in the creation of a strain of Escherichia coli that, by virtue of a nucleoside triphosphate transporter from Phaeodactylum tricornutum, imports the requisite unnatural triphosphates from its medium and then uses them to replicate a plasmid containing the unnatural base pair dNaM–dTPT3. Although the semi-synthetic organism stores increased information when compared to natural organisms, retrieval of the information requires in vivo transcription of the unnatural base pair into mRNA and tRNA, aminoacylation of the tRNA with a non-canonical amino acid, and efficient participation of the unnatural base pair in decoding at the ribosome. Here we report the in vivo transcription of DNA containing dNaM and dTPT3 into mRNAs with two different unnatural codons and tRNAs with cognate unnatural anticodons, and their efficient decoding at the ribosome to direct the site-specific incorporation of natural or non-canonical amino acids into superfolder green fluorescent protein. The results demonstrate that interactions other than hydrogen bonding can contribute to every step of information storage and retrieval. The resulting semi-synthetic organism both encodes and retrieves increased information and should serve as a platform for the creation of new life forms and functions.

 

Yorke Zhang, Jerod L. Ptacin, Emil C. Fischer, Hans R. Aerni, Carolina E. Caffaro, Kristine San Jose, Aaron W. Feldman, Court R. Turner & Floyd E. Romesberg
Nature 551, 644–647 (30 November 2017)
doi:10.1038/nature24659


How the news media activate public expression and influence national agendas

We demonstrate that exposure to the news media causes Americans to take public stands on specific issues, join national policy conversations, and express themselves publicly—all key components of democratic politics—more often than they would otherwise. After recruiting 48 mostly small media outlets, we chose groups of these outlets to write and publish articles on subjects we approved, on dates we randomly assigned. We estimated the causal effect on proximal measures, such as website pageviews and Twitter discussion of the articles’ specific subjects, and distal ones, such as national Twitter conversation in broad policy areas. Our intervention increased discussion in each broad policy area by ~62.7% (relative to a day’s volume), accounting for 13,166 additional posts over the treatment week, with similar effects across population subgroups.

 

Gary King, Benjamin Schneer, Ariel White

Science  10 Nov 2017:
Vol. 358, Issue 6364, pp. 776-780
DOI: 10.1126/science.aao1100


A Fantastic Voyage in Genomics


Imagine being able to shrink down to a small enough size to peer into the human body at the single-cell level. Now take a deep breath and plunge into that cell to see all of the ongoing biological processes, including the full complement of molecules and their locations within the cell. This has long been the realm of science fiction, but not for much longer. Recent technological advances now allow us to identify and visualize RNA transcripts, proteins, and other cellular components at the single-cell level. This has led to discoveries about the immune system, brain, and developmental processes and is poised to revolutionize our understanding of the entire human body.

 

Laura M. Zahn

Science  06 Oct 2017:
Vol. 358, Issue 6359, pp. 56-57
DOI: 10.1126/science.358.6359.56


Rescuing Collective Wisdom when the Average Group Opinion Is Wrong

The total knowledge contained within a collective supersedes the knowledge of even its most intelligent member. Yet the collective knowledge will remain inaccessible to us unless we are able to find efficient knowledge aggregation methods that produce reliable decisions based on the behavior or opinions of the collective’s members. It is often stated that simple averaging of a pool of opinions is a good and in many cases the optimal way to extract knowledge from a crowd. The method of averaging has been applied to analysis of decision-making in very different fields, such as forecasting, collective animal behavior, individual psychology, and machine learning. Two mathematical theorems, Condorcet’s theorem and Jensen’s inequality, provide a general theoretical justification for the averaging procedure. Yet the necessary conditions which guarantee the applicability of these theorems are often not met in practice. Under such circumstances, averaging can lead to suboptimal and sometimes very poor performance. Practitioners in many different fields have independently developed procedures to counteract the failures of averaging. We review such knowledge aggregation procedures and interpret the methods in the light of a statistical decision theory framework to explain when their application is justified. Our analysis indicates that in the ideal case, there should be a matching between the aggregation procedure and the nature of the knowledge distribution, correlations, and associated error costs. This leads us to explore how machine learning techniques can be used to extract near-optimal decision rules in a data-driven manner. We end with a discussion of open frontiers in the domain of knowledge aggregation and collective intelligence in general.

 


Andres Laan, Gabriel Madirolas, and Gonzalo G. de Polavieja

Front. Robot. AI, 06 November 2017 | https://doi.org/10.3389/frobt.2017.00056
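A toy numerical illustration (not from the paper) of the failure mode discussed above: when most opinions share a common bias, simple averaging inherits that bias, while a "select crowd" rule that aggregates only members with good track records can recover the truth. All numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = 100.0
biased = rng.normal(130, 5, size=45)       # correlated, biased majority
experts = rng.normal(truth, 5, size=5)     # small accurate minority
opinions = np.concatenate([biased, experts])
skill = np.concatenate([np.full(45, 0.2), np.full(5, 0.9)])  # past accuracy

simple_average = opinions.mean()
top_k = opinions[np.argsort(skill)[-5:]]   # aggregate only top-skill members
select_crowd = top_k.mean()

print(f"truth={truth:.1f}  average={simple_average:.1f}  "
      f"select-crowd={select_crowd:.1f}")
```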


The fundamental advantages of temporal networks

Historically, network science focused on static networks, in which nodes are connected by permanent links. However, in networked systems ranging from protein-protein interactions to social networks, links change. Although it might seem that permanent links would make it easier to control a system, Li et al. demonstrate that temporality has advantages in real and simulated networks. Temporal networks can be controlled more efficiently and require less energy than their static counterparts.

 

A. Li, S. P. Cornelius, Y.-Y. Liu, L. Wang, A.-L. Barabási
Science  24 Nov 2017:
Vol. 358, Issue 6366, pp. 1042-1046
DOI: 10.1126/science.aai7488

Marcelo Errera's curator insight, November 27, 6:57 PM
One more interesting article on the evolution of networks. Networks, or anything else, can only evolve in time if there is a degree of freedom. Live networks never stay static in time; if a network is static, it must be dead in the design-evolution sense.
It's a law of physics.

Social Complex Contagion in Music Listenership: A Natural Experiment with 1.3 Million Participants

Can live music events generate complex contagion in music streaming? This paper finds evidence in the affirmative, but only for the most popular artists. We generate a novel dataset from Last.fm, a music tracking website, to analyse the listenership history of 1.3 million users over a two-month time horizon. We use daily play counts along with event attendance data to run a regression discontinuity analysis in order to show the causal impact of concert attendance on music listenership among attendees and their friends' networks. First, we show that attending a music artist's live concert increases that artist's listenership among the attendees of the concert by approximately 1 song per day per attendee (p-value < 0.001). Moreover, we show that this effect is contagious and can spread to users who did not attend the event. However, the extent of contagion depends on the type of artist. We only observe contagious increases in listenership for well-established, popular artists (0.06 more daily plays per friend of an attendee [p < 0.001]), while the effect is absent for emerging stars. We also show that the contagion effect size increases monotonically with the number of friends who have attended the live event.

 

John Ternovski, Taha Yasseri
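A minimal sketch of the regression discontinuity logic on synthetic data: compare daily play counts just before and just after the concert date and read the jump at the cutoff as the causal effect. The paper's actual estimation is richer (friend networks, controls, significance testing).

```python
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(-30, 30)                      # days relative to the concert
# synthetic play counts with a true jump of +1 play/day at the event
plays = 2.0 + 1.0 * (days >= 0) + rng.normal(0, 0.5, days.size)

pre, post = plays[days < 0], plays[days >= 0]
effect = post.mean() - pre.mean()              # local jump at the discontinuity
print(f"estimated jump at event: {effect:.2f} plays/day")
```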


Algorithmic Cognition and the Computational Nature of the Mind

The idea that complexity, or its reverse, simplicity, is an essential concept for cognitive psychology was already understood in the middle of the twentieth century (Mach 1914), and these concepts have remained salient ever since (Oizumi et al. 2014). As early as the 1990s, the algorithmic theory of information was referenced by some researchers in psychology, who recommended the use of algorithmic complexity as a universal normative measure of complexity. Nevertheless, the noncomputability of algorithmic complexity was deemed an insurmountable obstacle, and more often than not it merely served as a point of reference.
In recent years, we have been able to create and use more reliable estimates of algorithmic complexity using the coding theorem method (Gauvrit et al. 2014b, 2016). This has made it possible to deploy a precise and quantitative approximation of algorithmic complexity, with applications in many areas of psychology and the behavioral sciences – sometimes ...

 


Hector Zenil, Nicolas Gauvrit

Living Reference Work Entry
Encyclopedia of Complexity and Systems Science
pp 1-9
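A minimal sketch of the coding theorem method mentioned above: algorithmic complexity is approximated as K(s) ≈ −log2 D(s), where D(s) is the empirical frequency with which small Turing machines output the string s. The frequency table below is a hypothetical stand-in for the published enumeration data.

```python
import math

# Hypothetical output-frequency distribution D(s) from exhaustively running
# small Turing machines (placeholder values, not the published tables).
D = {
    "0000": 5e-3,
    "0101": 1e-3,
    "0110100110010110": 1e-9,
}

def ctm_complexity(s: str) -> float:
    """Coding-theorem estimate of algorithmic complexity, in bits."""
    return -math.log2(D[s])

for s in D:
    # simpler strings are produced more often, hence lower estimated complexity
    print(s, round(ctm_complexity(s), 2))
```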

nukem777's curator insight, November 24, 12:37 PM
not for the faint of heart


How Self-Organization Happens

Self-organization refers to natural processes of human relating that are similar at all scales of order in the natural world. The dynamics of self-organization are much richer and more complex than the simple patterns we use to model them. Being able to make sense of these dynamics enables us to build new potentials in teams. The level of trust rises when we recognize our basic human capacity to collaborate with each other. Narrative-based applications can visualize some of the subtle patterns that shape a team's potential for acting in certain ways (and not others) over time.
There isn't one specific pattern that emerges from self-organization. The processes are so deep and fundamental to human interactions that you cannot enforce any specific hierarchical or non-hierarchical pattern with rules. Trust between people is an outcome of allowing people to freely self-organize. Complex networks of trust emerge and change as people continuously negotiate their relationships.


Via june holley
nukem777's curator insight, November 24, 12:38 PM
Finally, something practical ... kudos

Stream Graphs and Link Streams for the Modeling of Interactions over Time

Graph theory provides a language for studying the structure of relations, and it is often used to study interactions over time too. However, it poorly captures the jointly temporal and structural nature of interactions, which calls for a dedicated formalism. In this paper, we generalize graph concepts in order to cope with both aspects in a consistent way. We start with elementary concepts like density, clusters, or paths, and derive from them more advanced concepts like cliques, degrees, clustering coefficients, or connected components. We obtain a language to directly deal with interactions over time, similar to the language provided by graphs to deal with relations. This formalism is self-consistent: usual relations between different concepts are preserved. It is also consistent with graph theory: graph concepts are special cases of the ones we introduce. This makes it easy to generalize higher-level objects such as quotient graphs, line graphs, k-cores, and centralities. This paper also considers discrete versus continuous time assumptions, instantaneous links, and extensions to more complex cases.

 

Matthieu Latapy, Tiphaine Viard, Clémence Magnien
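In the spirit of the formalism above, a discrete-time link stream can be stored as (time, node, node) triples, and its density read off as the fraction of (time, pair) slots that actually carry a link. The toy stream below is hypothetical, and the paper's continuous-time definitions differ in detail.

```python
from itertools import combinations

nodes = ["a", "b", "c"]
times = range(4)
# links as (t, u, v) triples: who interacts with whom, and when
links = {(0, "a", "b"), (1, "a", "b"), (1, "b", "c"), (3, "a", "c")}

possible = len(list(combinations(nodes, 2))) * len(times)  # 3 pairs x 4 steps
density = len(links) / possible
print(f"link-stream density = {density:.3f}")              # 4 / 12
```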


Small vulnerable sets determine large network cascades in power grids

Sometimes a power failure can be fairly local, but other times, a seemingly identical initial failure can cascade to cause a massive and costly breakdown in the system. Yang et al. built a model for the North American power grid network based on samples of data covering the years 2008 to 2013 (see the Perspective by D'Souza). Although the observed cascades were widespread, a small fraction of all network components, particularly the ones that were most cohesive within the network, were vulnerable to cascading failures. Larger cascades were associated with concurrent triggering events that were geographically closer to each other and closer to the set of vulnerable components.

 

Yang Yang, Takashi Nishikawa, Adilson E. Motter

Science  17 Nov 2017:
Vol. 358, Issue 6365, eaan3184
DOI: 10.1126/science.aan3184


Expanding and reprogramming the genetic code


Nature uses a limited, conservative set of amino acids to synthesize proteins. The ability to genetically encode an expanded set of building blocks with new chemical and physical properties is transforming the study, manipulation and evolution of proteins, and is enabling diverse applications, including approaches to probe, image and control protein function, and to precisely engineer therapeutics. Underpinning this transformation are strategies to engineer and rewire translation. Emerging strategies aim to reprogram the genetic code so that noncanonical biopolymers can be synthesized and evolved, and to test the limits of our ability to engineer the translational machinery and systematically recode genomes.

 

Jason W. Chin
Nature 550, 53–60 (05 October 2017)
doi:10.1038/nature24031


A proposed methodology for studying the historical trajectory of words’ meaning through Tsallis entropy

The availability of historical textual corpora has led to the study of words' frequency along the historical timeline, as representing the public's focus of attention over time. However, the study of the dynamics of words' meaning is still in its infancy. In this paper, we propose a methodology for studying the historical trajectory of words' meaning through Tsallis entropy. First, we present the idea that the meaning of a word may be studied through the entropy of its embedding. Using two historical case studies, we show that this entropy measure is correlated with the intensity with which a word is used. More importantly, we show that using Tsallis entropy with a superadditive entropy index may provide a better estimation of a word's frequency of use than using Shannon entropy. We explain this finding as resulting from an increasing redundancy between the words that comprise the semantic field of the target word and develop a new measure of redundancy between words. Using this measure, which relies on the Tsallis version of the Kullback–Leibler divergence, we show that the evolving meaning of a word involves the dynamics of increasing redundancy between components of its semantic field. The proposed methodology may enrich the toolkit of researchers who study the dynamics of word senses.

 

Neuman, Y., Cohen, Y., Israeli, N., & Tamir, B. (2017). A proposed methodology for studying the historical trajectory of words’ meaning through Tsallis entropy. Physica A: Statistical Mechanics and its Applications.
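For reference, the Tsallis entropy used above is S_q(p) = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers Shannon entropy as q → 1 and, by the pseudo-additivity relation for independent systems, is superadditive for q < 1. A small sketch on a hypothetical probability vector (e.g., a distribution over a word's semantic field):

```python
import numpy as np

def tsallis_entropy(p: np.ndarray, q: float) -> float:
    """S_q = (1 - sum_i p_i^q) / (q - 1); Shannon entropy (nats) as q -> 1."""
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return float(-(p * np.log(p)).sum())   # Shannon limit
    return float((1.0 - (p ** q).sum()) / (q - 1.0))

p = np.array([0.5, 0.25, 0.125, 0.125])        # hypothetical distribution
for q in (0.5, 1.0, 2.0):
    print(f"q={q}: S_q = {tsallis_entropy(p, q):.4f}")
```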


Roads to riches or ruin?

We are living in the most explosive era of infrastructure expansion in human history (1, 2). In the next 3 years, paved roads are projected to double in length in Asia's developing nations (3); in the next three decades, the total length of additional paved roads could approach 25 million kilometers worldwide—enough to encircle the planet more than 600 times (1). Nine-tenths of all new infrastructure is being built in developing nations (1), mainly in tropical and subtropical regions that contain Earth's most diverse ecosystems. In a world that is projected to have 2 billion vehicles by 2030 (4), we need a better understanding of the impacts of roads and other infrastructure on our planet, societies, and economies (1–3, 5)—and more effective planning to ensure that the benefits of infrastructure outstrip its costs.

 

William F. Laurance, Irene Burgués Arrea

Science  27 Oct 2017:
Vol. 358, Issue 6362, pp. 442-444
DOI: 10.1126/science.aao0312


Strains, functions and dynamics in the expanded Human Microbiome Project

The characterization of baseline microbial and functional diversity in the human microbiome has enabled studies of microbiome-related disease, diversity, biogeography, and molecular function. The National Institutes of Health Human Microbiome Project has provided one of the broadest such characterizations so far. Here we introduce a second wave of data from the study, comprising 1,631 new metagenomes (2,355 total) targeting diverse body sites with multiple time points in 265 individuals. We applied updated profiling and assembly methods to provide new characterizations of microbiome personalization. Strain identification revealed subspecies clades specific to body sites; it also quantified species with phylogenetic diversity under-represented in isolate genomes. Body-wide functional profiling classified pathways into universal, human-enriched, and body site-enriched subsets. Finally, temporal analysis decomposed microbial variation into rapidly variable, moderately variable, and stable subsets. This study furthers our knowledge of baseline human microbial diversity and enables an understanding of personalized microbiome function and dynamics.

 

Jason Lloyd-Price, Anup Mahurkar, Gholamali Rahnavard, Jonathan Crabtree, Joshua Orvis, A. Brantley Hall, Arthur Brady, Heather H. Creasy, Carrie McCracken, Michelle G. Giglio, Daniel McDonald, Eric A. Franzosa, Rob Knight, Owen White & Curtis Huttenhower
Nature 550, 61–66 (05 October 2017)
doi:10.1038/nature23889


Five ways to fix statistics

As debate rumbles on about how and how much poor statistics is to blame for poor reproducibility, Nature asked influential statisticians to recommend one change to improve science. The common theme? The problem is not our maths, but ourselves.

Failure of incentives in multiplex networks

Governments and enterprises strongly rely on incentives to generate favorable outcomes from social and strategic interactions between individuals, for example climate- or environment-friendly actions. The incentives are usually modeled by payoffs in strategic games, such as the prisoner's dilemma or the harmony game. Adjusting the incentives by changing the payoff parameters, e.g. through tax schemes, can favor cooperation (harmony) over defection (prisoner's dilemma). Here, we show that if individuals engage in strategic interactions in multiple domains, incentives can fail and the final outcome, cooperation or defection, is dominated by the initial state of the system. Our findings highlight the importance of taking the multilayer structure of human interactions into account and of rethinking payoff-based incentives.

 

Kaj-Kolja Kleineberg, Dirk Helbing
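A toy single-layer illustration (not the paper's multiplex model) of how payoff parameters encode incentives: with reward R and punishment P fixed, the temptation T and sucker's payoff S decide whether defection or cooperation is the dominant strategy. Values are illustrative.

```python
import numpy as np

R, P = 1.0, 0.0                        # reward and punishment payoffs

def payoff_matrix(T: float, S: float) -> np.ndarray:
    # rows: my move (0 = cooperate, 1 = defect); columns: opponent's move
    return np.array([[R, S],
                     [T, P]])

games = {"prisoner's dilemma": (1.5, -0.5),   # T > R and S < P
         "harmony":            (0.5,  0.5)}   # T < R and S > P

for name, (T, S) in games.items():
    M = payoff_matrix(T, S)
    dominant = M.argmax(axis=0)        # best reply to each opponent move
    print(f"{name}: best replies (vs C, vs D) = {['CD'[i] for i in dominant]}")
```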


Big Data Fusion to Estimate Fuel Consumption: A Case Study of Riyadh

Falling oil revenues and rapid urbanization are putting a strain on the budgets of oil producing nations which often subsidize domestic fuel consumption. A direct way to decrease the impact of subsidies is to reduce fuel consumption by reducing congestion and car trips. While fuel consumption models have started to incorporate data sources from ubiquitous sensing devices, the opportunity is to develop comprehensive models at urban scale leveraging sources such as Global Positioning System (GPS) data and Call Detail Records. We combine these big data sets in a novel method to model fuel consumption within a city and estimate how it may change due to different scenarios. To do so we calibrate a fuel consumption model for use on any car fleet fuel economy distribution and apply it in Riyadh, Saudi Arabia. The model proposed, based on speed profiles, is then used to test the effects on fuel consumption of reducing flow, both randomly and by targeting the most fuel inefficient trips in the city. The estimates considerably improve baseline methods based on average speeds, showing the benefits of the information added by the GPS data fusion. The presented method can be adapted to also measure emissions. The results constitute a clear application of data analysis tools to help decision makers compare policies aimed at achieving economic and environmental goals.

 

Adham Kalila, Zeyad Awwad, Riccardo Di Clemente, Marta C. González
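A heavily hedged sketch of the general idea behind a speed-profile fuel model: map each GPS speed sample to a consumption rate via a U-shaped curve and integrate over the trip. The curve and coefficients below are hypothetical placeholders, not the paper's calibrated model.

```python
import numpy as np

def liters_per_100km(v_kmh: np.ndarray) -> np.ndarray:
    # U-shaped consumption curve: idling losses dominate at low speeds,
    # aerodynamic drag at high speeds (illustrative coefficients only)
    return 90.0 / np.maximum(v_kmh, 5.0) + 4.0 + 2.5e-4 * v_kmh**2

segment_speeds = np.array([15.0, 30.0, 50.0, 80.0, 110.0])  # km/h from GPS
segment_dists = np.array([1.0, 2.0, 5.0, 10.0, 8.0])        # km per segment

fuel_l = (liters_per_100km(segment_speeds) / 100.0 * segment_dists).sum()
print(f"estimated trip fuel: {fuel_l:.2f} L")
```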


The architecture of mutualistic networks as an evolutionary spandrel

Mutualistic networks have been shown to involve complex patterns of interactions among animal and plant species, including a widespread presence of nestedness. The nested structure of these webs seems to be positively correlated with higher diversity and resilience. Moreover, these webs exhibit marked measurable structural patterns, including broad distributions of connectivity, strongly asymmetrical interactions and hierarchical organization. Hierarchical organization is an especially interesting property, since it is positively correlated with biodiversity and network resilience, thus suggesting potential selection processes favouring the observed web organization. However, here we show that all these structural quantitative patterns—and nestedness in particular—can be properly explained by means of a very simple dynamical model of speciation and divergence with no selection-driven coevolution of traits. The agreement between observed and modelled networks suggests that the patterns displayed by real mutualistic webs might actually represent evolutionary spandrels.

 

Sergi Valverde, Jordi Piñero, Bernat Corominas-Murtra, Jose Montoya, Lucas Joppa & Ricard Solé
Nature Ecology & Evolution (2017)
doi:10.1038/s41559-017-0383-4


Social network and temporal discounting

For reasons of social influence and social logistics, people in closed networks are expected to experience time compression: the more closed a person's network, the steeper the person's discount function, and the narrower the expected time horizon within which the person deliberates events and behavior. Consistent with the hypothesis, data on managers at the top of three organizations show network closure associated with a social life compressed into daily contact with colleagues. Further, language in closed networks is predominantly about current activities, ignoring the future. Further still, discount functions employed by executive MBA students show more severe discounting by students in more closed networks. Inattention to the future can be argued to impair achievement; however, I find no evidence across the managers that daily contact diminishes the achievement associated with network advantage. I close with comments on replication and extrapolation to language more generally, within-person variation, and select cognitive patterns (closure bias, end of history, and felt status loss).

 

 


RONALD S. BURT
Network Science, Volume 5 / Issue 4, November 2017, pp 411 - 440
doi: 10.1017/nws.2017.23
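A minimal sketch of fitting the kind of discount function discussed above, using the common hyperbolic form V = A/(1 + kD): a larger fitted k means steeper discounting and a more compressed time horizon. The choice data are hypothetical, and scipy is assumed available.

```python
import numpy as np
from scipy.optimize import curve_fit

def discounted_value(delay_days, k):
    return 100.0 / (1.0 + k * delay_days)   # present value of $100 at a delay

delays = np.array([0, 7, 30, 90, 365], dtype=float)
stated_values = np.array([100, 85, 60, 40, 15], dtype=float)  # hypothetical

(k_hat,), _ = curve_fit(discounted_value, delays, stated_values, p0=[0.01])
print(f"fitted discount rate k = {k_hat:.4f}")
```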


Modelling the effects of phylogeny and body size on within-host pathogen replication and immune response

Understanding how quickly pathogens replicate and how quickly the immune system responds is important for predicting the epidemic spread of emerging pathogens. Host body size, through its correlation with metabolic rates, is theoretically predicted to impact pathogen replication rates and immune system response rates. Here, we use mathematical models of viral time courses from multiple species of birds infected by a generalist pathogen (West Nile Virus; WNV) to test more thoroughly how disease progression and immune response depend on mass and host phylogeny. We use hierarchical Bayesian models coupled with nonlinear dynamical models of disease dynamics to incorporate the hierarchical nature of host phylogeny.

Our analysis suggests an important role for both host phylogeny and species mass in determining factors important for viral spread such as the basic reproductive number, WNV production rate, peak viraemia in blood and competency of a host to infect mosquitoes. Our model is based on a principled analysis and gives a quantitative prediction for key epidemiological determinants and how they vary with species mass and phylogeny. This leads to new hypotheses about the mechanisms that cause certain taxonomic groups to have higher viraemia. For example, our models suggest that higher viral burst sizes cause corvids to have higher levels of viraemia and that the cellular rate of virus production is lower in larger species.

We derive a metric of competency of a host to infect disease vectors and thereby sustain the disease between hosts. This suggests that smaller passerine species are highly competent at spreading the disease compared with larger non-passerine species. Our models lend mechanistic insight into why some species (smaller passerine species) are pathogen reservoirs and some (larger non-passerine species) are potentially dead-end hosts for WNV. Our techniques give insights into the role of body mass and host phylogeny in the spread of WNV and potentially other zoonotic diseases.

The major contribution of this work is a computational framework for infectious disease modelling at the within-host level that leverages data from multiple species. This is likely to be of interest to modellers of infectious diseases that jump species barriers and infect multiple species. Our method can be used to computationally determine the competency of a host to infect mosquitoes that will sustain WNV and other zoonotic diseases. We find that smaller passerine species are more competent in spreading the disease than larger non-passerine species. This suggests the role of host phylogeny as an important determinant of within-host pathogen replication. Ultimately, we view our work as an important step in linking within-host viral dynamics models to between-host models that determine spread of infectious disease between different hosts.

 

Soumya Banerjee, Alan S. Perelson, Melanie Moses
Journal of the Royal Society Interface. Published 15 November 2017. DOI: 10.1098/rsif.2017.0479
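Work like this builds on standard within-host viral dynamics. A minimal sketch of the classic target-cell-limited model (dT/dt = −βTV, dI/dt = βTV − δI, dV/dt = pI − cV), Euler-integrated; the parameters below are hypothetical, not the paper's fitted values.

```python
import numpy as np

b, d, p, c = 3e-8, 1.0, 1e3, 3.0   # infection, cell death, production, clearance
T, I, V = 1e7, 0.0, 10.0           # target cells, infected cells, free virus
dt, days = 0.001, 10.0

peak_v, peak_day = V, 0.0
for step in range(int(days / dt)):
    dT = -b * T * V
    dI = b * T * V - d * I
    dV = p * I - c * V
    T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
    if V > peak_v:
        peak_v, peak_day = V, step * dt

print(f"peak viraemia ~{peak_v:.2e} at day {peak_day:.1f}")
```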


Analytical framework for the study of epidemic models on activity driven networks

Network theory has greatly contributed to an improved understanding of epidemic processes, offering an empowering framework for the analysis of real-world data, prediction of disease outbreaks, and formulation of containment strategies. However, the current state of knowledge largely relies on time-invariant networks, which are not adequate to capture several key features of a number of infectious diseases. Activity driven networks (ADNs) constitute a promising modelling framework to describe epidemic spreading over time-varying networks, but a number of technical and theoretical gaps remain open. Here, we lay the foundations for a novel theory to model general epidemic spreading processes over time-varying ADNs. Our theory derives a continuous-time model, based on ordinary differential equations (ODEs), which can reproduce the dynamics of any discrete-time epidemic model evolving over an ADN. A rigorous, formal framework is developed, so that a general epidemic process can be systematically mapped, at first, on a Markov jump process, and then, in the thermodynamic limit, on a system of ODEs. The obtained ODEs can be integrated to simulate the system dynamics, instead of using computationally intensive Monte Carlo simulations. An array of mathematical tools for the analysis of the proposed model is offered, together with techniques to approximate and predict the dynamics of the epidemic spreading, from its inception to the endemic equilibrium. The theoretical framework is illustrated step-by-step through the analysis of a susceptible–infected–susceptible process. Once the framework is established, applications to more complex epidemic models are presented, along with numerical results that corroborate the validity of our approach. Our framework is expected to find application in the study of a number of critical phenomena, including behavioural changes due to the infection, unconscious spread of the disease by exposed individuals, or the removal of nodes from the network of contacts.

 

Lorenzo Zino, Alessandro Rizzo, Maurizio Porfiri
Journal of Complex Networks, cnx056, https://doi.org/10.1093/comnet/cnx056
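For context, a minimal sketch of the discrete-activity-class mean-field equations for SIS spreading on an activity-driven network (in the style of Perra et al. 2012), Euler-integrated to the endemic equilibrium; the activity classes and parameters are hypothetical.

```python
import numpy as np

a = np.array([0.01, 0.1, 1.0])       # activity rates of the classes
eta = np.array([0.70, 0.25, 0.05])   # fraction of nodes in each class
m, lam, mu = 5, 0.5, 0.5             # links per activation, infection, recovery

rho = np.full(3, 1e-3)               # infected fraction within each class
dt = 0.01
for _ in range(int(200 / dt)):       # Euler-integrate the class ODEs
    mean_rho = (eta * rho).sum()             # <rho>: chance a contacted node is infected
    mean_a_rho = (eta * a * rho).sum()       # <a rho>: infected nodes contacting me
    drho = -mu * rho + lam * m * (1 - rho) * (a * mean_rho + mean_a_rho)
    rho += dt * drho

print("endemic infected fraction per class:", np.round(rho, 4))
```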


Curtailing cascading failures

Cascading behaviors are ubiquitous, from power-grid failures (1) to “flash crashes” in financial markets (2, 3) to the spread of political movements such as the “Arab Spring” (4). The causes of these cascades are varied with many unknowns, which make them extremely difficult to predict or contain. Particularly challenging are cascading failures that arise from the reorganization of flows on a network, such as in electric power grids, supply chains, and transportation networks. Here, the network edges (or “links”) have some fixed capacity, and we see that some small disturbances easily dampen out, but other seemingly similar ones lead to massive failures. On page 886 of this issue, Yang et al. (5) establish that a small “vulnerable set” of components in the power grid is implicated in large-scale outages. Although the exact elements in this set vary with operating conditions, they reveal intriguing correlations with network structure.

 

Raissa M. D'Souza

Science  17 Nov 2017:
Vol. 358, Issue 6365, pp. 860-861
DOI: 10.1126/science.aaq0474
