Complex Adaptive Systems Modeling welcomes submissions to the new thematic series on Modeling large-scale communication networks using complex networks and agent-based modeling techniques. This thematic series intends to publish high-quality original research as well as review articles on case studies, models and methods for the modeling and simulation of large-scale computer communication networks using either of the following two approaches: complex networks (modeled using tools such as Gephi, Network Workbench and others) or agent-based models (based on NetLogo, Repast, Mason, Swarm and others). Potential topics include, but are not limited to: multi-agent systems, cognitive sensor networks, wireless sensor networks, sensor-actuator networks, cloud computing infrastructures, the Internet of Things, service-oriented architectures, pervasive/mobile computing, and peer-to-peer networks. http://www.casmodeling.com/about/update/COMM_NETS
Via Complexity Digest
Recently, much attention has been paid to the study of the robustness of interdependent and multiplex networks and, in particular, of networks of networks. The robustness of interdependent networks can be evaluated by the size of a mutually connected component when a fraction of nodes has been removed from these networks. Here we characterize the emergence of the mutually connected component in a network of networks in which every node of a network (layer) alpha is connected with q_alpha of its randomly chosen replicas in some other networks and is interdependent with these nodes with probability r. We find that when the superdegrees q_alpha of the different layers in a network of networks are distributed heterogeneously, multiple percolation phase transitions can occur. We show that, depending on the value of r, these transitions are continuous or discontinuous.
Via Claudia Mihai
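As a rough illustration of the mutually connected component this abstract uses to measure robustness, the sketch below computes it for the simplest case of two fully interdependent layers over one shared node set (i.e., effectively r = 1 with one replica per node, not the paper's general q_alpha setting); the function names and toy graphs are my own.

```python
from collections import defaultdict, deque

def make_adj(edges):
    """Build an undirected adjacency map from an edge list."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return adj

def largest_component(nodes, adj):
    """Largest connected component restricted to `nodes`, via BFS."""
    nodes, seen, best = set(nodes), set(), set()
    for start in nodes:
        if start in seen:
            continue
        comp, queue = {start}, deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v in nodes and v not in seen:
                    seen.add(v)
                    comp.add(v)
                    queue.append(v)
        if len(comp) > len(best):
            best = comp
    return best

def mutual_component(nodes, layer_a, layer_b):
    """Prune until every surviving node lies in the giant component of both layers."""
    active = set(nodes)
    while True:
        shrunk = largest_component(largest_component(active, layer_a), layer_b)
        if shrunk == active:
            return active
        active = shrunk
```

For example, if layer B disconnects nodes {3, 4} from the rest, the mutually connected component collapses to {0, 1, 2} even when layer A is fully connected.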
Transfer entropy is a recently introduced information-theoretic measure quantifying directed statistical coherence between spatiotemporal processes, and is widely used in diverse fields ranging from finance to neuroscience. However, its relationships to fundamental limits of computation, such as Landauer's limit, remain unknown. Here we show that in order to increase transfer entropy (predictability) by one bit, heat flow must match or exceed Landauer's limit. Importantly, we generalise Landauer's limit to bidirectional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure). Furthermore, the results are related to negentropy, and to Bremermann's limit and the Bekenstein bound, producing, perhaps surprisingly, lower bounds on the computational deceleration and information loss incurred during an increase in predictability about the process. The identified relationships set new computational limits in terms of fundamental physical quantities, and establish transfer entropy as a central measure connecting information theory, thermodynamics and the theory of computation. Transfer Entropy and Transient Limits of Computation. Mikhail Prokopenko and Joseph T. Lizier. Scientific Reports 4, 5394. doi:10.1038/srep05394 http://www.nature.com/srep/2014/140623/srep05394/full/srep05394.html
Via Complexity Digest
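The paper bounds the heat cost of increasing transfer entropy; the measure itself is straightforward to estimate for discrete series. Below is a minimal plug-in estimator with history length 1 (a simplification of the general measure; this is my own sketch, not the authors' code).

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate (in bits) of transfer entropy from X to Y with
    history length 1, i.e. I(Y_{t+1}; X_t | Y_t) for discrete-valued series."""
    n = len(y) - 1
    triples, pairs_yx, pairs_yy, singles_y = Counter(), Counter(), Counter(), Counter()
    for t in range(n):
        triples[(y[t + 1], y[t], x[t])] += 1
        pairs_yx[(y[t], x[t])] += 1
        pairs_yy[(y[t + 1], y[t])] += 1
        singles_y[y[t]] += 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_full = c / pairs_yx[(y0, x0)]              # p(y1 | y0, x0)
        p_marg = pairs_yy[(y1, y0)] / singles_y[y0]  # p(y1 | y0)
        te += (c / n) * log2(p_full / p_marg)
    return te

# Example: y copies x with a one-step lag, so roughly one bit per step
# is transferred from X to Y, and essentially none in the reverse direction.
random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
```

The asymmetry between te_xy and te_yx is what makes the measure directed, unlike mutual information.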
The intelligence phenomenon continues to fascinate scientists and engineers, remaining an elusive moving target. Following numerous past observations (e.g., Hofstadter, 1985, p. 585), it can be pointed out that several attempts to construct “artificial intelligence” have turned to designing programs with discriminative power. These programs would allow computers to discern between meaningful and meaningless in ways similar to how humans perform this task. Interestingly, as noted by de Looze (2006) among others, such discrimination is based on the etymology of “intellect”, derived from the Latin “intellego” (inter-lego): to choose between, or to perceive/read (a core message) between (alternatives). In terms of computational intelligence, the ability to read between the lines, extracting some new essence, corresponds to mechanisms capable of generating computational novelty and choice, coupled with active perception, learning, prediction, and postdiction. When a robot demonstrates stable control in the presence of a priori unknown environmental perturbations, it exhibits intelligence. When a software agent generates and learns new behaviors in a self-organizing rather than a predefined way, it seems to be curiosity-driven. When an algorithm rapidly solves a hard computational problem by efficiently exploring its search space, it appears intelligent. Prokopenko M (2014) Grand challenges for computational intelligence. Front. Robot. AI 1:2. http://journal.frontiersin.org/Journal/10.3389/frobt.2014.00002/full
Via Complexity Digest
The availability of the human genome sequence has transformed biomedical research over the past decade. However, an equivalent map for the human proteome, with direct measurements of proteins and peptides, does not yet exist. Here we present a draft map of the human proteome using high-resolution Fourier-transform mass spectrometry. In-depth proteomic profiling of 30 histologically normal human samples, including 17 adult tissues, 7 fetal tissues and 6 purified primary haematopoietic cells, resulted in the identification of proteins encoded by 17,294 genes, accounting for approximately 84% of the total annotated protein-coding genes in humans. A unique and comprehensive strategy for proteogenomic analysis enabled us to discover a number of novel protein-coding regions, which include translated pseudogenes, non-coding RNAs and upstream open reading frames. This large human proteome catalogue (available as an interactive web-based resource at http://www.humanproteomemap.org ) will complement available human genome and transcriptome data to accelerate biomedical research in health and disease. A draft map of the human proteome. Min-Sik Kim, et al. Nature 509, 575–581 (29 May 2014) http://dx.doi.org/10.1038/nature13302
Via Complexity Digest
It is commonly believed that information spreads between individuals like a pathogen, with each exposure by an informed friend potentially resulting in a naive individual becoming infected. However, empirical studies of social media suggest that individual responses to repeated exposure to information are far more complex. As a proxy for intervention experiments, we compare user responses to multiple exposures on two different social media sites, Twitter and Digg. We show that the position of exposing messages on the user interface strongly affects social contagion. Accounting for this visibility significantly simplifies the dynamics of social contagion. The likelihood that an individual will spread information increases monotonically with exposure, while explicit feedback about how many friends have previously spread it increases the likelihood of a response. We provide a framework for unifying information visibility, divided attention, and explicit social feedback to predict the temporal dynamics of user behavior.
Via Shaolin Tan
Time plays an essential role in the diffusion of information, influence, and disease over networks. In many cases we can only observe when a node is activated by a contagion—when a node learns about a piece of information, makes a decision, adopts a new behavior, or becomes infected with a disease. However, the underlying network connectivity and transmission rates between nodes are unknown. Inferring the underlying diffusion dynamics is important because it leads to new insights and enables forecasting, as well as influencing or containing information propagation. In this paper we model diffusion as a continuous temporal process occurring at different rates over a latent, unobserved network that may change over time. Given information diffusion data, we infer the edges and dynamics of the underlying network. Our model naturally imposes sparse solutions and requires no parameter tuning. We develop an efficient inference algorithm that uses stochastic convex optimization to compute online estimates of the edges and transmission rates. We evaluate our method by tracking information diffusion among 3.3 million mainstream media sites and blogs, and experiment with more than 179 million different instances of information spreading over the network in a one-year period. We apply our network inference algorithm to the top 5,000 media sites and blogs and report several interesting observations. First, information pathways for general recurrent topics are more stable across time than for ongoing news events. Second, clusters of news media sites and blogs often emerge and vanish in a matter of days for ongoing news events. Finally, major events, for example, large-scale civil unrest as in the Libyan civil war or Syrian uprising, increase the number of information pathways among blogs, and also increase the network centrality of blogs and social media sites.
Uncovering the structure and temporal dynamics of information propagation. Manuel Gomez Rodriguez, Jure Leskovec, David Balduzzi, Bernhard Schölkopf. Network Science, Volume 2, Issue 1, April 2014, pp. 26–65. http://dx.doi.org/10.1017/nws.2014.3
Via Complexity Digest, Shaolin Tan
We have often observed unexpected state transitions of complex systems, and are thus interested in how to steer a complex system from an unexpected state to a desired one. Here we introduce the concept of transittability of complex networks, and derive a new necessary and sufficient condition for state transittability that can be efficiently verified. We define the steering kernel as a minimal set of steering nodes to which control signals must be directly applied for the transition between two specific states of a network, and propose a graph-theoretic algorithm to identify the steering kernel of a network for the transition between two specific states. We applied our algorithm to 27 real complex networks, finding that the sizes of the steering kernels required for transittability are much smaller than those required for complete controllability. Furthermore, applications to regulatory biomolecular networks not only validated our method but also identified the steering kernels for their phenotype transitions.
Via Shaolin Tan
Power grids, road maps, and river streams are examples of infrastructural networks which are highly vulnerable to external perturbations. An abrupt local change of load (voltage, traffic density, or water level) might propagate in a cascading way and affect a significant fraction of the network. Almost discontinuous perturbations can be modeled by shock waves which can eventually interfere constructively and endanger the normal functionality of the infrastructure. We study their dynamics by solving the Burgers equation under random perturbations on several real and artificial directed graphs. Even for graphs with a narrow distribution of node properties (e.g., degree or betweenness), a steady state is reached exhibiting a heterogeneous load distribution, with a difference of one order of magnitude between the highest and average loads. Unexpectedly, we find for the European power grid and for finite Watts-Strogatz networks a broad, pronounced bimodal distribution of the loads. To identify the most vulnerable nodes, we introduce the concept of node-basin size, a purely topological property which we show to be strongly correlated with the average load of a node.
Via Shaolin Tan
Despite growing interest in quantifying and modeling the scoring dynamics within professional sports games, relatively little is known about what patterns or principles, if any, cut across different sports. Using a comprehensive data set of scoring events in nearly a dozen consecutive seasons of college and professional (American) football, professional hockey, and professional basketball, we identify several common patterns in scoring dynamics. Across these sports, scoring tempo (when scoring events occur) closely follows a common Poisson process, with a sport-specific rate. Similarly, scoring balance (how often a team wins an event) follows a common Bernoulli process, with a parameter that effectively varies with the size of the lead. Combining these processes within a generative model of gameplay, we find they both reproduce the observed dynamics in all four sports and accurately predict game outcomes. These results demonstrate common dynamical patterns underlying within-game scoring dynamics across professional team sports, and suggest specific mechanisms driving them. We close with a brief discussion of the implications of our results for several popular hypotheses about sports dynamics.
Via Bernard Ryefield
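The two processes named in the abstract compose into a very small generative model. The sketch below uses an illustrative scoring rate (not the paper's fitted values) and a fixed Bernoulli parameter, whereas the paper lets that parameter vary with the size of the lead.

```python
import random

def simulate_game(rate_per_sec, duration_sec, points=1, p_win=0.5, seed=None):
    """One game: scoring events at Poisson times (exponential inter-event
    gaps), each event awarded to the home team with Bernoulli probability p_win."""
    rng = random.Random(seed)
    t, home, away, events = 0.0, 0, 0, []
    while True:
        t += rng.expovariate(rate_per_sec)
        if t > duration_sec:
            break
        if rng.random() < p_win:
            home += points
        else:
            away += points
        events.append((t, home, away))
    return events

# 200 illustrative games: roughly one scoring event every 30 s over 48 minutes
games = [simulate_game(rate_per_sec=1 / 30, duration_sec=2880, seed=s) for s in range(200)]
mean_events = sum(len(g) for g in games) / len(games)
```

The expected number of events per game is simply rate times duration (here about 96), which is the Poisson-tempo regularity the authors report.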
In this paper, we empirically analyze real-world human movements based on GPS records, and observe rich scaling properties in the temporal-spatial patterns, as well as an abnormal transition in the speed-displacement patterns together with evidence of real-world traffic jams. In addition, we notice that the displacements at the population level show a significant positive correlation, indicating a cascading-like nature in human movements. Furthermore, our analysis at the individual level finds that the displacement distributions of users with stronger correlations are usually closer to a power law, suggesting a correlation between the positive correlation of the displacement series and the form of an individual's displacement distribution.
Via Bernard Ryefield

Following last year's successful edition, we have once more decided to organize a summer school coinciding with the European Conference on Complex Systems, thus taking advantage of the opportunity offered by the presence of a wide variety of experts in different topics in Lucca. The school aims to offer young researchers the opportunity to learn new methods, present their work and meet fellow researchers, and it also represents a good opportunity for young researchers to prepare their participation in the main ECCS conference in an informal and relaxed environment. Following our policy of displaying local talent, three renowned Italian researchers will each present a different aspect of complex networks in three-hour sessions. Names such as Dr. Roberta Sinatra, Dr. Ciro Cattuto and Prof. Stefano Battiston should sound familiar to any interested student. Furthermore, we plan a meeting where each participant will have the possibility to share their work with the others, organized as a flash-presentation workshop. Of course, a major social event is also included, to stimulate networking and “prepare” for the official ECCS conference. http://eccswarmup.wordpress.com
Via Complexity Digest
Core percolation is a fundamental structural transition in complex networks related to a wide range of important problems. Recent advances have provided an analytical framework for core percolation in uncorrelated random networks with arbitrary degree distributions. Here we apply these tools to the analysis of network controllability. We confirm analytically that the emergence of the bifurcation in control coincides with the formation of the core, and that the structure of the core determines the control mode of the network. We also derive an analytical expression related to controllability robustness by extending the derivation for core percolation. These findings help us better understand the interesting interplay between the structural and dynamical properties of complex networks.
Via Shaolin Tan
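The core referred to in this abstract is conventionally obtained by greedy leaf removal: repeatedly delete any degree-one node together with its sole neighbor; the nodes that still retain an edge at the end form the core. A small self-contained sketch (naming is my own):

```python
from collections import deque

def core(nodes, edges):
    """Greedy leaf removal: repeatedly delete a degree-one node together with
    its neighbor; nodes that keep at least one edge form the core."""
    adj = {u: set() for u in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    queue = deque(u for u in adj if len(adj[u]) == 1)
    removed = set()
    while queue:
        leaf = queue.popleft()
        if leaf in removed or not adj[leaf]:
            continue  # already deleted, or isolated by earlier removals
        neighbor = next(iter(adj[leaf]))
        for w in (leaf, neighbor):
            removed.add(w)
            for z in adj[w]:
                adj[z].discard(w)
                if len(adj[z]) == 1 and z not in removed:
                    queue.append(z)  # deletion has created a new leaf
            adj[w].clear()
    return {u for u in adj if adj[u]}
```

A cycle has no leaves, so it survives intact, while a star collapses entirely: removing one leaf takes the hub with it and isolates the rest.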
Over the course of human history, thousands of languages have developed from what was once a much smaller number. How did we end up with so many? And how do we keep track of them all? Alex Gendler explains how linguists group languages into language families, demonstrating how these linguistic trees give us crucial insights into the past.
Via Ashish Umre
Human interactions give rise to the formation of different kinds of opinions in a society. The study of the formation and dynamics of opinions has been one of the most important areas in social physics.
Via Shaolin Tan
Social networks pervade our everyday lives: we interact, influence, and are influenced by our friends and acquaintances. With the advent of the World Wide Web, large amounts of data on social networks have become available, allowing the quantitative analysis of the distribution of information on them, including behavioral traits and fads. Recent studies of correlations among members of a social network who exhibit the same trait have shown that individuals influence not only their direct contacts but also friends’ friends, up to a network distance extending beyond their closest peers. Here, we show how such patterns of correlations between peers emerge in networked populations. We use standard models (yet reflecting intrinsically different mechanisms) of information spreading to argue that empirically observed patterns of correlation among peers emerge naturally from a wide range of dynamics, being essentially independent of the type of information, of how it spreads, and even of the class of underlying network that interconnects individuals. Finally, we show that the sparser and more clustered the network, the more far-reaching the influence of each individual will be. Origin of Peer Influence in Social Networks. Flávio L. Pinheiro, Marta D. Santos, Francisco C. Santos, and Jorge M. Pacheco. Phys. Rev. Lett. 112, 098702 – Published 6 March 2014. DOI: http://dx.doi.org/10.1103/PhysRevLett.112.098702
Via Complexity Digest, Shaolin Tan
The compartmental models used to study epidemic spreading often assume the same susceptibility for all individuals and are, therefore, agnostic about the effects that differences in susceptibility can have on epidemic spreading. Here we show that, for the SIS model, differential susceptibility can make networks more vulnerable to the spread of diseases when the correlation between a node's degree and susceptibility is positive, and less vulnerable when this correlation is negative. Moreover, we show that networks become more likely to contain a pocket of infection when individuals are more likely to connect with others of similar susceptibility (when the network is segregated). These results show that failure to include differential susceptibility in epidemic models can lead to a systematic over- or underestimation of fundamental epidemic parameters when the structure of the network is not independent of the susceptibility of the nodes, or when there are correlations between the susceptibilities of connected individuals.
Via Shaolin Tan
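The key modeling change the abstract describes is small: in a networked SIS model, each node gets its own susceptibility factor multiplying the transmission probability. A minimal synchronous-update sketch, assuming my own discrete-time formulation rather than the authors' exact model:

```python
import random

def sis_step(adj, infected, susceptibility, beta, mu, rng):
    """One synchronous SIS update: node i is infected by each infectious
    neighbor independently with probability beta * susceptibility[i];
    infected nodes recover with probability mu."""
    nxt = set()
    for node in adj:
        if node in infected:
            if rng.random() > mu:  # fails to recover this step
                nxt.add(node)
        else:
            p_escape = 1.0
            for nb in adj[node]:
                if nb in infected:
                    p_escape *= 1.0 - beta * susceptibility[node]
            if rng.random() > p_escape:
                nxt.add(node)
    return nxt

# Example: a path 0-1-2 where node 2 is completely insusceptible
rng = random.Random(0)
adj = {0: {1}, 1: {0, 2}, 2: {1}}
susceptibility = {0: 1.0, 1: 1.0, 2: 0.0}
state = {0}
for _ in range(5):
    state = sis_step(adj, state, susceptibility, beta=1.0, mu=0.0, rng=rng)
```

With beta = 1 and mu = 0 the susceptible neighbor is infected immediately and stays infected, while the zero-susceptibility node is never reached, which is the degree-susceptibility interplay the paper studies at scale.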
Recently, the impact of network structure on evolutionary dynamics has been at the center of attention when studying the evolutionary process of structured populations. This paper aims at identifying the key structural feature of a network that captures its impact on evolutionary dynamics. To this end, a novel concept called heat heterogeneity is introduced to characterize the structural heterogeneity of a network, and the correlation between the heat heterogeneity of a structure and the outcome of evolutionary dynamics is investigated on various networks. It is found that heat heterogeneity largely determines the impact of network structure on evolutionary dynamics on complex networks. In detail, heat heterogeneity readjusts the selection effect on evolutionary dynamics: networks with high heat heterogeneity amplify the selection effect in the birth-death process and suppress it in the death-birth process. Based on these results, an effective algorithm is proposed to generate selection adjusters with a desired size and average degree.
Via Shaolin Tan
While studies of aging are widely framed in terms of their demarcation of degenerative processes, the brain provides a unique opportunity to uncover the adaptive effects of getting older. Though it is intuitively reasonable that life experience and wisdom should reside somewhere in the human cortex, these features have eluded neuroscientific explanation. The present study utilizes a “Bayesian Brain” framework to motivate an analysis of cortical circuit processing. From a Bayesian perspective, the brain represents a model of its environment and offers predictions about the world, while responding, through changing synaptic strengths, to novel interactions and experiences. We hypothesized that these predictive and updating processes are modified as we age, representing an optimization of neuronal architecture. Using novel sensory stimuli, we demonstrate that synaptic connections of older brains resist trial-by-trial learning to provide a robust model of their sensory environment. These older brains are capable of processing a wider range of sensory inputs, representing experienced generalists. We thus explain how, contrary to a singularly degenerative point of view, aging neurobiological effects may be understood, in sanguine terms, as adaptive and useful. Moran RJ, Symmonds M, Dolan RJ, Friston KJ (2014) The Brain Ages Optimally to Model Its Environment: Evidence from Sensory Learning over the Adult Lifespan. PLoS Comput Biol 10(1): e1003422. http://dx.doi.org/10.1371/journal.pcbi.1003422
Via Complexity Digest
This paper reports theoretical and experimental studies on spatiotemporal dynamics in the choruses of male Japanese tree frogs. First, we theoretically model their calling times and positions as a system of coupled mobile oscillators.
Via Bernard Ryefield
We explain how specific dynamical properties give rise to the limit distribution of sums of deterministic variables at the transition to chaos via the period-doubling route. We study the sums of successive positions generated by an ensemble of initial conditions uniformly distributed over the entire phase space of a unimodal map, as represented by the logistic map. We find that these sums acquire their salient, multiscale features from the repellor preimage structure that dominates the dynamics toward the attractors along the period-doubling cascade, and we explain how these properties are transmitted from the sums to their distribution. Specifically, we show how the stationary distribution of sums of positions at the Feigenbaum point is built up from those associated with the supercycle attractors, forming a hierarchical structure with multifractal and discrete scale-invariance properties.
Via Bernard Ryefield
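The ensemble construction the abstract describes is easy to reproduce numerically: iterate the logistic map at the period-doubling accumulation point and accumulate the sum of positions for each initial condition. This sketch omits the centering and rescaling the paper applies to obtain the limit distribution; the parameter value is the standard accumulation point to the quoted precision.

```python
def logistic_sums(mu, n_iter, x0_values):
    """Sum of the first n_iter successive positions of the logistic map
    x -> mu * x * (1 - x), for each initial condition in the ensemble."""
    sums = []
    for x0 in x0_values:
        x, total = x0, 0.0
        for _ in range(n_iter):
            x = mu * x * (1.0 - x)
            total += x
        sums.append(total)
    return sums

# Accumulation point of the period-doubling cascade (the Feigenbaum point)
MU_INF = 3.5699456718
x0s = [i / 1000 for i in range(1, 1000)]  # ensemble uniform over (0, 1)
sums = logistic_sums(MU_INF, 2000, x0s)
```

Since the map keeps x in [0, 1] for mu <= 4, every sum is bounded by the number of iterations; the interesting multiscale structure appears in the histogram of these sums across the ensemble.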

The first question that then needs to be answered is: what is a complex adaptive system? David Krakauer defines complex systems as “systems that don’t yield compact forms of representation”. In other words, a complex system cannot be described by a simple set of equations. Why would this be the case? As Krakauer notes, it is the “adaptive” nature of these systems that leads to this intractability. Agents within a complex adaptive system respond to each set of environmental conditions with a different set of responses, and the number of such environments and corresponding agent responses that must be accounted for to construct an accurate model of the system is simply too large. But is this simply a problem of impracticality? Could we, at least in theory, construct a model that takes into account all possible environmental conditions and all possible agent behaviours? Although some scientists may argue that such an approach is theoretically possible, there is ample evidence that the critical “adaptive” component of some complex adaptive systems may in fact be unmodelable. There is no better example of this than the problems faced by the economist Hyman Minsky in formalising many of his most important ideas.
Interesting to read!
Another way to think about environmental influences as the source of tensions at work (as opposed to individual lack of employee performance or motivational problems). Are these 'tensions' actually symptoms of a system out of balance? Are these tensions the real gems for organisational steering? Check out how Holacracy capitalises on exactly that!