Papers
Recent publications related to complex systems
Rescooped by Complexity Digest from Social Foraging

Collective Learning and Optimal Consensus Decisions in Social Animal Groups

Learning has been studied extensively in the context of isolated individuals. However, many organisms are social and consequently make decisions both individually and as part of a collective. Reaching consensus necessarily means that a single option is chosen by the group, even when there are dissenting opinions. This decision-making process decouples the otherwise direct relationship between animals' preferences and their experiences (the outcomes of decisions). Instead, because an individual's learned preferences influence what others experience, and therefore learn about, collective decisions couple the learning processes between social organisms. This introduces a new, and previously unexplored, dynamical relationship between preference, action, experience and learning. Here we model collective learning within animal groups that make consensus decisions. We reveal how learning as part of a collective results in behavior that is fundamentally different from that learned in isolation, allowing grouping organisms to spontaneously (and indirectly) detect correlations between group members' observations of environmental cues, adjust strategy as a function of changing group size (even if that group size is not known to the individual), and achieve a decision accuracy that is very close to that which is provably optimal, regardless of environmental contingencies. Because these properties make minimal cognitive demands on individuals, collective learning, and the capabilities it affords, may be widespread among group-living organisms. Our work emphasizes the importance and need for theoretical and experimental work that considers the mechanism and consequences of learning in a social context.
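The provably optimal accuracy mentioned above can be contrasted with the simplest collective baseline, Condorcet-style majority voting among independent individuals. The sketch below is a generic illustration of that baseline (the function name and parameters are ours), not the paper's learning model:

```python
from math import comb

def majority_accuracy(n, p):
    """Probability that a simple majority of n independent individuals,
    each correct with probability p, chooses the better option
    (the classic Condorcet baseline; n is assumed odd)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Group accuracy rises with group size when individuals beat chance;
# correlated observations, the case studied in the paper, break this
# independence assumption.
print(majority_accuracy(1, 0.6))              # → 0.6
print(round(majority_accuracy(11, 0.6), 3))
```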

 

Kao AB, Miller N, Torney C, Hartnett A, Couzin ID (2014) Collective Learning and Optimal Consensus Decisions in Social Animal Groups. PLoS Comput Biol 10(8): e1003762. http://dx.doi.org/10.1371/journal.pcbi.1003762


Via Ashish Umre
Scooped by Complexity Digest

A method for building self-folding machines

Origami can turn a sheet of paper into complex three-dimensional shapes, and similar folding techniques can produce structures and mechanisms. To demonstrate the application of these techniques to the fabrication of machines, we developed a crawling robot that folds itself. The robot starts as a flat sheet with embedded electronics, and transforms autonomously into a functional machine. To accomplish this, we developed shape-memory composites that fold themselves along embedded hinges. We used these composites to recreate fundamental folded patterns, derived from computational origami, that can be extrapolated to a wide range of geometries and mechanisms. This origami-inspired robot can fold itself in 4 minutes and walk away without human intervention, demonstrating the potential both for complex self-folding machines and autonomous, self-controlled assembly.


A method for building self-folding machines
S. Felton, M. Tolley, E. Demaine, D. Rus, R. Wood

Science 8 August 2014:
Vol. 345 no. 6197 pp. 644-646
http://dx.doi.org/10.1126/science.1252610

Scooped by Complexity Digest

Cloudy With a Chance of War


He then sorted his “deadly quarrels” the way geologists classify earthquakes, ranking each “quarrel” according to the base-10 logarithm of the number of deaths it produced. The base-10 logarithm of a number describes how many times 10 must be multiplied to produce that number. A riot that leaves 100 dead in this system has a magnitude of 2 (the base—10—must be multiplied by itself to yield 100). And a conflict that kills 10 million people has a magnitude of 7 (multiplying seven tens will yield 10 million). Defining “deadly quarrels” on a logarithmic scale also served Richardson’s project to get people thinking about violence without illusion. Like the Richter scale for earthquakes, his logarithmic graphs let the reader see all quarrels, from murders to global war, as a single phenomenon on a single scale.
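Richardson's magnitude arithmetic is straightforward to reproduce in code; a minimal sketch (the function name is ours):

```python
import math

def quarrel_magnitude(deaths):
    """Richardson's magnitude of a deadly quarrel:
    the base-10 logarithm of its death toll."""
    return math.log10(deaths)

print(quarrel_magnitude(100))         # → 2.0  (a riot leaving 100 dead)
print(quarrel_magnitude(10_000_000))  # → 7.0  (a conflict killing 10 million)
```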


http://nautil.us/issue/15/turbulence/cloudy-with-a-chance-of-war

Scooped by Complexity Digest

The impact of network structure on innovation efficiency: An agent-based study in the context of innovation networks

This article investigates the impact of network structure on innovation efficiency by establishing a simulation model of the innovation process in the context of innovation networks. The results indicate that short path lengths between vertices are conducive to high efficiency of explorative innovations, dense clusters are conducive to high efficiency of exploitative innovations, and high small-worldness is conducive to high efficiency of a hybrid of these two innovation types. Moreover, we discuss the reasons for these results and give some suggestions to innovators and innovation policy makers.


The impact of network structure on innovation efficiency: An agent-based study in the context of innovation networks
Lei Hua and Wenping Wang

Complexity

http://dx.doi.org/10.1002/cplx.21583

Suggested by Walter Quattrociocchi

Science vs Conspiracy: collective narratives in the age of (mis)information

The large availability of user-provided content on online social media facilitates people's aggregation around common interests, worldviews and narratives. However, in spite of the enthusiastic rhetoric about the so-called "wisdom of crowds", unsubstantiated rumors -- as alternative explanations to mainstream versions of complex phenomena -- find on the Web a natural medium for their dissemination. In this work we study, on a sample of 1.2 million individuals, how information related to very distinct narratives -- i.e. mainstream scientific and alternative news -- is consumed on Facebook. Through a thorough quantitative analysis, we show that distinct communities with similar information consumption patterns emerge around distinctive narratives. Moreover, consumers of alternative news (mainly conspiracy theories) turn out to be more focused on their contents, while scientific news consumers are more prone to comment on alternative news. We conclude our analysis by testing the response of this social system to 4,709 troll posts, i.e. parodistic imitations of alternative and conspiracy theories. We find that, despite the false and satirical vein of these posts, the usual consumers of conspiracy news are the most prone to interact with them.


Science vs Conspiracy: collective narratives in the age of (mis)information
Alessandro Bessi, Mauro Coletto, George Alexandru Davidescu, Antonio Scala, Guido Caldarelli, Walter Quattrociocchi

http://arxiv-web3.library.cornell.edu/abs/1408.1667

Scooped by Complexity Digest

Charting culture

This animation distils hundreds of years of culture into just five minutes. A team of historians and scientists wanted to map cultural mobility, so they tracked the births and deaths of notable individuals like David, King of Israel, and Leonardo da Vinci, from 600 BC to the present day. Using them as a proxy for skills and ideas, their map reveals intellectual hotspots and tracks how empires rise and crumble.


The information comes from Freebase, a Google-owned database of well-known people and places, and other catalogues of notable individuals. The visualization was created by Maximilian Schich (University of Texas at Dallas) and Mauro Martino (IBM).




https://www.youtube.com/watch?v=4gIhRkCcD4U&index=1&list=PL7yuGPz_odjMW3YfSRkFRjoDdGsTrZDyD 

Read Nature's news story: http://www.nature.com/news/1.15650

See Also: http://sco.lt/8by75F 


Suggested by eflegara

Quantifying the semantics of search behavior before stock market moves

Internet search data may offer new possibilities to improve forecasts of collective behavior, if we can identify which parts of these gigantic search datasets are relevant. We introduce an automated method that uses data from Google and Wikipedia to identify relevant topics in search data before large events. Using stock market moves as a case study, our method successfully identifies historical links between searches related to business and politics and subsequent stock market moves. We find that the predictive value of these search terms has recently diminished, potentially reflecting increasing incorporation of Internet data into automated trading strategies. We suggest that extensions of these analyses could help draw links between search data and a range of other collective actions.

 

Quantifying the semantics of search behavior before stock market moves

C. Curme, T. Preis, H. E. Stanley, and H. S. Moat
http://dx.doi.org/10.1073/pnas.1324054111

Rescooped by Complexity Digest from Complex World

How bird flocks are like liquid helium


Mathematical model shows how hundreds of starlings coordinate their movements in flight.

A flock of starlings flies as one, a spectacular display in which each bird flits about as if in a well-choreographed dance. Everyone seems to know exactly when and where to turn. Now, for the first time, researchers have measured how that knowledge moves through the flock—a behavior that mirrors certain quantum phenomena of liquid helium.


Via Claudia Mihai
Scooped by Complexity Digest

On the structural stability of mutualistic systems

Structural stability has played a major role in several fields such as evolutionary developmental biology, in which it has brought the view that some morphological structures are more common than others because they are compatible with a wider range of developmental conditions. In community ecology, structural stability is the sort of framework needed to study the consequences of global environmental change—by definition, large and directional—on species coexistence. Structural stability will serve to assess both the range of variability a given community can withstand and why some community patterns are more widespread than others.


On the structural stability of mutualistic systems
Rudolf P. Rohr, Serguei Saavedra, Jordi Bascompte

Science 25 July 2014:
Vol. 345 no. 6195
http://dx.doi.org/10.1126/science.1253497

Scooped by Complexity Digest

From Dyson to Hopfield: Processing on hierarchical networks

We consider statistical-mechanical models for spin systems built on hierarchical structures, which provide a simple example of a non-mean-field framework. We show that the coupling decay with spin distance can give rise to peculiar features and phase diagrams much richer than their mean-field counterpart. In particular, we consider the Dyson model, mimicking ferromagnetism in lattices, and we prove the existence of a number of meta-stabilities, beyond the ordered state, which get stable in the thermodynamic limit. Such a feature is retained when the hierarchical structure is coupled with the Hebb rule for learning, hence mimicking the modular architecture of neurons, and gives rise to an associative network able to perform both as a serial processor and as a parallel processor, depending crucially on the external stimuli and on the rate of interaction decay with distance; however, those emergent multitasking features reduce the network capacity with respect to the mean-field counterpart. The analysis is accomplished through statistical mechanics, graph theory, signal-to-noise techniques and numerical simulations, all in full consistency. Our results shed light on the biological complexity shown by real networks, and suggest future directions for understanding more realistic models.
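For context on the Hebb rule mentioned above, here is a minimal flat (mean-field) Hopfield-style sketch of Hebbian storage and recall. It is a generic illustration with names of our choosing, not the authors' hierarchical model:

```python
import numpy as np

def hebb_weights(patterns):
    """Hebb rule: couplings J = (1/N) * sum_p outer(xi_p, xi_p), zero diagonal."""
    _, N = patterns.shape
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def recall(J, state, steps=10):
    """Synchronous sign-updates of all spins from a (possibly noisy) cue."""
    for _ in range(steps):
        state = np.where(J @ state >= 0, 1, -1)
    return state

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 64))   # 3 random patterns, 64 spins
J = hebb_weights(patterns)
noisy = patterns[0].copy()
noisy[:5] *= -1                                # corrupt 5 spins
# With few stored patterns, recall typically recovers the stored pattern.
print(np.array_equal(recall(J, noisy), patterns[0]))
```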


From Dyson to Hopfield: Processing on hierarchical networks
Elena Agliari, Adriano Barra, Andrea Galluzzi, Francesco Guerra, Daniele Tantari, Flavia Tavani

http://arxiv.org/abs/1407.5019

Scooped by Complexity Digest

A system‐level model for the microbial regulatory genome

Microbes can tailor transcriptional responses to diverse environmental challenges despite having streamlined genomes and a limited number of regulators. Here, we present data‐driven models that capture the dynamic interplay of the environment and genome‐encoded regulatory programs of two types of prokaryotes: Escherichia coli (a bacterium) and Halobacterium salinarum (an archaeon). The models reveal how the genome‐wide distributions of cis‐acting gene regulatory elements and the conditional influences of transcription factors at each of those elements encode programs for eliciting a wide array of environment‐specific responses. We demonstrate how these programs partition transcriptional regulation of genes within regulons and operons to re‐organize gene–gene functional associations in each environment. The models capture fitness‐relevant co‐regulation by different transcriptional control mechanisms acting across the entire genome, to define a generalized, system‐level organizing principle for prokaryotic gene regulatory networks that goes well beyond existing paradigms of gene regulation. An online resource (http://egrin2.systemsbiology.net) has been developed to facilitate multiscale exploration of conditional gene regulation in the two prokaryotes.


A system‐level model for the microbial regulatory genome
Aaron N Brooks, David J Reiss, Antoine Allard, Wei‐Ju Wu, Diego M Salvanha, Christopher L Plaisier, Sriram Chandrasekaran, Min Pan, Amardeep Kaur, Nitin S Baliga
Mol Syst Biol. (2014) 10: 740

http://dx.doi.org/10.15252/msb.20145160

Scooped by Complexity Digest

Null Models for Community Detection in Spatially-Embedded, Temporal Networks

In the study of networks, it is often insightful to use algorithms to determine mesoscale features such as "community structure", in which densely connected sets of nodes constitute "communities" that have sparse connections to other communities. The most popular way of detecting communities algorithmically is to optimize the quality function known as modularity. When optimizing modularity, one compares the actual connections in a (static or time-dependent) network to the connections obtained from a random-graph ensemble that acts as a null model. The communities are then the sets of nodes that are connected to each other densely relative to what is expected from the null model. Clearly, the process of community detection depends fundamentally on the choice of null model, so it is important to develop and analyze novel null models that take into account appropriate features of the system under study. In this paper, we investigate the effects of using null models that incorporate spatial information, and we propose a novel null model based on the radiation model of population spread. We also develop novel synthetic spatial benchmark networks in which the connections between entities are based on distance or flux between nodes, and we compare the performance of both static and time-dependent radiation null models to the standard ("Newman-Girvan") null model for modularity optimization and a recently-proposed gravity null model. In our comparisons, we use both the above synthetic benchmarks and time-dependent correlation networks that we construct using countrywide dengue fever incidence data for Peru. We also evaluate a recently-proposed correlation null model, which was developed specifically for correlation networks that are constructed from time series, on the epidemic-correlation data.
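The dependence of modularity on the null model can be made concrete with a short sketch. Below, Newman-Girvan modularity is computed with the expected-edge-weight matrix as a swappable argument (the names are ours; the paper's spatial null models are not implemented here, only the slot for them):

```python
import numpy as np

def modularity(A, communities, null=None):
    """Modularity Q = (1/2m) * sum_ij (A_ij - P_ij) * delta(c_i, c_j),
    where P is the null-model matrix of expected edge weights.
    Defaults to the standard Newman-Girvan null P_ij = k_i*k_j / 2m;
    a spatial (e.g. gravity or radiation) null can be passed in instead."""
    A = np.asarray(A, dtype=float)
    m2 = A.sum()                        # = 2m for an undirected network
    if null is None:
        k = A.sum(axis=1)
        null = np.outer(k, k) / m2
    same = np.equal.outer(communities, communities)
    return ((A - null) * same).sum() / m2

# Two triangles joined by one edge: the two-community split scores
# higher than putting all nodes in a single community.
A = np.zeros((6, 6))
for i, j in [(0,1),(0,2),(1,2),(3,4),(3,5),(4,5),(2,3)]:
    A[i, j] = A[j, i] = 1
print(modularity(A, [0,0,0,1,1,1]))   # ≈ 5/14 ≈ 0.357
print(modularity(A, [0,0,0,0,0,0]))   # ≈ 0 (no structure detected)
```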


Null Models for Community Detection in Spatially-Embedded, Temporal Networks
Marta Sarzynska, Elizabeth A. Leicht, Gerardo Chowell, Mason A. Porter

http://arxiv.org/abs/1407.6297

Scooped by Complexity Digest

Biosemiotic Entropy: Concluding the Series

This article concludes the special issue on Biosemiotic Entropy looking toward the future on the basis of current and prior results. It highlights certain aspects of the series, concerning factors that damage and degenerate biosignaling systems. As in ordinary linguistic discourse, well-formedness (coherence) in biological signaling systems depends on valid representations correctly construed: a series of proofs are presented and generalized to all meaningful sign systems. The proofs show why infants must (as empirical evidence shows they do) proceed through a strict sequence of formal steps in acquiring any language. Classical and contemporary conceptions of entropy and information are deployed showing why factors that interfere with coherence in biological signaling systems are necessary and sufficient causes of disorders, diseases, and mortality. Known sources of such formal degeneracy in living organisms (here termed, biosemiotic entropy) include: (a) toxicants, (b) pathogens; (c) excessive exposures to radiant energy and/or sufficiently powerful electromagnetic fields; (d) traumatic injuries; and (e) interactions between the foregoing factors. Just as Jaynes proved that irreversible changes invariably increase entropy, the theory of true narrative representations (TNR theory) demonstrates that factors disrupting the well-formedness (coherence) of valid representations, all else being held equal, must increase biosemiotic entropy—the kind impacting biosignaling systems.


Biosemiotic Entropy: Concluding the Series
by John W. Oller
Entropy 2014, 16(7), 4060-4087; http://dx.doi.org/10.3390/e16074060

Scooped by Complexity Digest

Limits on fundamental limits to computation

An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.


Limits on fundamental limits to computation
Igor L. Markov
Nature 512, 147–154 (14 August 2014) http://dx.doi.org/10.1038/nature13570

ComplexInsight's curator insight, August 15, 2014 2:31 AM

Discussion of limits is key to creating new ideas. Igor Markov's paper is worth reading for its exploration of limitations and their engineering implications, and for triggering new discussions and ideas.

Suggested by Joseph Lizier

Self-organization in complex systems as decision making

The idea is advanced that self-organization in complex systems can be treated as decision making (as it is performed by humans) and, vice versa, decision making is nothing but a kind of self-organization in the decision maker's nervous system. A mathematical formulation is suggested based on the definition of probabilities of system states, whose particular cases characterize the probabilities of structures, patterns, scenarios, or prospects. In this general framework, it is shown that the mathematical structures of self-organization and of decision making are identical. This makes it clear how self-organization can be seen as an endogenous decision making process and, reciprocally, decision making occurs via an endogenous self-organization. The approach is illustrated by phase transitions in large statistical systems, crossovers in small statistical systems, evolutions and revolutions in social and biological systems, structural self-organization in dynamical systems, and by the probabilistic formulation of classical and behavioral decision theories. In all these cases, self-organization is described as the process of evaluating the probabilities of macroscopic states or prospects in the search for a state with the largest probability. The general way of deriving the probability measure for classical systems is the principle of minimal information, that is, the conditional entropy maximization under given constraints. Behavioral biases of decision makers can be characterized in the same way, as analogous to quantum fluctuations in natural systems.

 

Self-organization in complex systems as decision making
V.I. Yukalov, D. Sornette
arXiv:1408.1529, 2014
http://arxiv.org/abs/1408.1529

Eli Levine's curator insight, August 16, 2014 1:28 PM

Basically, the process of decision-making is a part of the system as a whole and not an externality. Where is the clear distinction between a user and the computer program that they choose to run? Can't it all be viewed as one thing?

 

Amazing implications for governing and government relative to society. 

Scooped by Complexity Digest

Online collaboration: Scientists and the social network

Giant academic social networks have taken off to a degree that no one expected even a few years ago. A Nature survey explores why.
Scooped by Complexity Digest

Crowdsourcing Dialect Characterization through Twitter

We perform a large-scale analysis of language diatopic variation using geotagged microblogging datasets. By collecting all Twitter messages written in Spanish over more than two years, we build a corpus from which a carefully selected list of concepts allows us to characterize Spanish varieties on a global scale. A cluster analysis proves the existence of well-defined macroregions sharing common lexical properties. Remarkably enough, we find that the Spanish language is split into two superdialects, namely, an urban speech used across major American and Spanish cities and a diverse form that encompasses rural areas and small towns. The latter can be further clustered into smaller varieties with a stronger regional character.


Crowdsourcing Dialect Characterization through Twitter
Bruno Gonçalves, David Sánchez

http://arxiv.org/abs/1407.7094

Suggested by eflegara

Competitive Dynamics on Complex Networks


We consider a dynamical network model in which two competitors have fixed and different states, and each normal agent adjusts its state according to a distributed consensus protocol. The state of each normal agent converges to a steady value which is a convex combination of the competitors' states, and is independent of the initial states of agents. This implies that the competition result is fully determined by the network structure and the positions of competitors in the network. We compute an Influence Matrix (IM) in which each element characterizes the influence of one agent on another agent in the network. We use the IM to predict the bias of each normal agent and thus predict which competitor will win. Furthermore, we compare the IM criterion with seven node centrality measures to predict the winner. We find that the competitor with higher Katz centrality in an undirected network or higher PageRank in a directed network is most likely to be the winner. These findings may shed new light on the role of network structure in competition and on the extent to which competitors could adjust the network structure so as to win the competition.
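The protocol described in the abstract is easy to simulate. Below is a minimal sketch of distributed averaging with two fixed-state competitors (the names and the example network are ours; the paper's Influence Matrix and centrality comparisons are not reproduced here):

```python
import numpy as np

def compete(A, stubborn, steps=500):
    """Distributed averaging in which the agents listed in `stubborn`
    (the two competitors) never update their fixed states."""
    A = np.asarray(A, dtype=float)
    W = A / A.sum(axis=1, keepdims=True)   # row-stochastic neighbor weights
    x = np.zeros(len(A))
    for node, value in stubborn.items():
        x[node] = value
    for _ in range(steps):
        x = W @ x                          # each normal agent averages neighbors
        for node, value in stubborn.items():
            x[node] = value                # competitors hold their states
    return x

# Path network 0-1-2-3 with competitors at the two ends.
A = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], dtype=float)
x = compete(A, {0: 0.0, 3: 1.0})
print(np.round(x, 3))   # → normal agents settle at 1/3 and 2/3
```

Each normal agent's final state is a convex combination of the competitors' states determined only by network position, matching the paper's qualitative claim.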

 

Competitive Dynamics on Complex Networks

Jiuhua Zhao, Qipeng Liu, & Xiaofan Wang
Scientific Reports 4, Article number: 5858
http://dx.doi.org/10.1038/srep05858

 

 

AleksBlumentals's curator insight, August 14, 2014 2:39 AM

How do you discover Caseworthiness? 

 


Tom Cockburn's curator insight, August 14, 2014 10:41 AM

Could be useful

Suggested by Emmanuelle Tognoli

The human dynamic clamp as a paradigm for social interaction

The human dynamic clamp (HDC) is proposed as a general paradigm for studies of elementary forms of social behavior in complex biological systems. HDC enables parametric control of real-time bidirectional interaction between humans and empirically grounded theoretical models of coordination dynamics. It thus provides necessary experimental access for laboratory investigations, while preserving the reciprocity and open boundary conditions inherent in daily life social interactions. As proof of concept, different implementations are illustrated, ranging from coordination of rhythmic and discrete movements to adaptive and directed behaviors. The HDC may be a powerful tool for blending theory and experiment at different levels of description, from neuronal populations to cognition and social behavior.


The human dynamic clamp as a paradigm for social interaction
Guillaume Dumas, Gonzalo C. de Guzman, Emmanuelle Tognoli, and J. A. Scott Kelso

http://dx.doi.org/10.1073/pnas.1407486111

Scooped by Complexity Digest

Complexity and the Emergence of Physical Properties

Using the effective complexity measure proposed by M. Gell-Mann and S. Lloyd, we give a quantitative definition of an emergent property. We draw on several previous results and properties of this particular information measure, which is closely related to the random features of the entity and its regularities.


Complexity and the Emergence of Physical Properties
Miguel Angel Fuentes

Entropy 2014, 16(8), 4489-4496; http://dx.doi.org/10.3390/e16084489

Costas Bouyioukos's curator insight, August 18, 2014 12:56 PM

Interesting for those who look for emergent properties in biological systems!

Scooped by Complexity Digest

A network framework of cultural history

The emergent processes driving cultural history are a product of complex interactions among large numbers of individuals, determined by difficult-to-quantify historical conditions. To characterize these processes, we have reconstructed aggregate intellectual mobility over two millennia through the birth and death locations of more than 150,000 notable individuals. The tools of network and complexity theory were then used to identify characteristic statistical patterns and determine the cultural and historical relevance of deviations. The resulting network of locations provides a macroscopic perspective of cultural history, which helps us to retrace cultural narratives of Europe and North America using large-scale visualization and quantitative dynamical tools and to derive historical trends of cultural centers beyond the scope of specific events or narrow time intervals.


A network framework of cultural history
Maximilian Schich, Chaoming Song, Yong-Yeol Ahn, Alexander Mirsky, Mauro Martino, Albert-László Barabási, Dirk Helbing

Science 1 August 2014: 
Vol. 345 no. 6196 pp. 558-562 
http://dx.doi.org/10.1126/science.1240064

Scooped by Complexity Digest

Stigmergy as a Universal Coordination Mechanism: components, varieties and applications

The concept of stigmergy has been used to analyze self-organizing activities in an ever-widening range of domains, from social insects via robotics and social media to human society. Yet, it is still poorly understood, and as such its full power remains underappreciated. The present paper clarifies the issue by defining stigmergy as a mechanism of indirect coordination in which the trace left by an action in a medium stimulates a subsequent action. It then analyses the fundamental components of the definition: action, agent, medium, trace and coordination. Stigmergy enables complex, coordinated activity without any need for planning, control, communication, simultaneous presence, or even mutual awareness. This makes the concept applicable to a very broad variety of cases, from chemical reactions to individual cognition and Internet-supported collaboration in Wikipedia.  The paper classifies different varieties of stigmergy according to general aspects (number of agents, scope, persistence, sematectonic vs. marker-based, and quantitative vs. qualitative), while emphasizing the fundamental continuity between these cases. This continuity can be understood from a non-linear, self-organizing dynamic that lets more complex forms of coordination evolve out of simpler ones. The paper concludes with two specifically human applications in cognition and cooperation, suggesting that without stigmergy these phenomena may never have evolved.


Heylighen, F. (2015). Stigmergy as a Universal Coordination Mechanism: components, varieties and applications. To appear in T. Lewis & L. Marsh (Eds.), Human Stigmergy: Theoretical Developments and New Applications, Studies in Applied Philosophy, Epistemology and Rational Ethics. Springer.
http://pespmc1.vub.ac.be/papers/stigmergy-varieties.pdf

Tom Cockburn's curator insight, August 3, 2014 3:31 AM

Indirect coordination in self organising

IT's curator insight, August 5, 2014 4:42 PM

That's quite a read, I tell you, and then they say they don't understand it.

Scooped by Complexity Digest

Self-organization on social media: endo-exo bursts and baseline fluctuations

A salient dynamic property of social media is bursting behavior. In this paper, we study bursting behavior in terms of the temporal relation between a preceding baseline fluctuation and the successive burst response using a frequency time series of 3,000 keywords on Twitter. We found that there is a fluctuation threshold up to which the burst size increases as the fluctuation increases and that above the threshold, there appears a variety of burst sizes. We call this threshold the critical threshold. Investigating this threshold in relation to endogenous bursts and exogenous bursts based on peak ratio and burst size reveals that the bursts below this threshold are endogenously caused and above this threshold, exogenous bursts emerge. Analysis of the 3,000 keywords shows that all the nouns have both endogenous and exogenous origins of bursts and that each keyword has a critical threshold in the baseline fluctuation value to distinguish between the two. Having a threshold for an input value for activating the system implies that Twitter is an excitable medium. These findings are useful for characterizing how excitable a keyword is on Twitter and could be used, for example, to predict the response to particular information on social media.

Self-organization on social media: endo-exo bursts and baseline fluctuations


Mizuki Oka, Yasuhiro Hashimoto, Takashi Ikegami
http://arxiv.org/abs/1407.6447

Scooped by Complexity Digest

The Rise of Social Bots

The Turing test asked whether one could recognize the behavior of a human from that of a computer algorithm. Today this question has suddenly become very relevant in the context of social media, where text constraints limit the expressive power of humans, and real incentives abound to develop human-mimicking software agents called social bots. These elusive entities wildly populate social media ecosystems, often going unnoticed among the population of real people. Bots can be benign or harmful, aiming at persuading, smearing, or deceiving. Here we discuss the characteristics of modern, sophisticated social bots, and how their presence can endanger online ecosystems and our society. We then discuss current efforts aimed at detection of social bots in Twitter. Characteristics related to content, network, sentiment, and temporal patterns of activity are imitated by bots but at the same time can help discriminate synthetic behaviors from human ones, yielding signatures of engineered social tampering.


The Rise of Social Bots
Emilio Ferrara, Onur Varol, Clayton Davis, Filippo Menczer, Alessandro Flammini

http://arxiv.org/abs/1407.5225

Scooped by Complexity Digest

Limits of Predictability in Commuting Flows in the Absence of Data for Calibration

The estimation of commuting flows at different spatial scales is a fundamental problem for different areas of study. Many current methods rely on parameters requiring calibration from empirical trip volumes. Their values are often not generalizable to cases without calibration data. To solve this problem we develop a statistical expression to calculate commuting trips with a quantitative functional form to estimate the model parameter when empirical trip data is not available. We calculate commuting trip volumes at scales from within a city to an entire country, introducing a scaling parameter alpha to the recently proposed parameter free radiation model. The model requires only widely available population and facility density distributions. The parameter can be interpreted as the influence of the region scale and the degree of heterogeneity in the facility distribution. We explore in detail the scaling limitations of this problem, namely under which conditions the proposed model can be applied without trip data for calibration. On the other hand, when empirical trip data is available, we show that the proposed model's estimation accuracy is as good as other existing models. We validated the model in different regions in the U.S., then successfully applied it in three different countries.
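The parameter-free radiation model that the paper extends with a scaling parameter alpha can be written down compactly. A hedged sketch of the base model follows (names are ours; the alpha extension itself is not implemented):

```python
import numpy as np

def radiation_flux(pop, xy, T_out):
    """Parameter-free radiation model:
    T_ij = T_i * m_i*m_j / ((m_i + s_ij) * (m_i + m_j + s_ij)),
    where s_ij is the total population strictly closer to i than j is,
    excluding i and j themselves, and T_i = T_out[i] commuters leave i."""
    pop = np.asarray(pop, dtype=float)
    xy = np.asarray(xy, dtype=float)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    n = len(pop)
    T = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            s = pop[d[i] < d[i, j]].sum() - pop[i]   # i sits at distance 0; j lies outside
            T[i, j] = T_out[i] * pop[i] * pop[j] / (
                (pop[i] + s) * (pop[i] + pop[j] + s))
    return T

# Three towns on a line; 10 commuters leave town 0.
pop = [100, 50, 200]
xy = [(0, 0), (1, 0), (2, 0)]
T = radiation_flux(pop, xy, T_out=[10, 0, 0])
print(np.round(T[0], 2))   # flows from town 0 to towns 1 and 2
```

Note that only population and trip-production inputs are needed, which is the property the paper exploits when no calibration data are available.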


Limits of Predictability in Commuting Flows in the Absence of Data for Calibration
Yingxiang Yang, Carlos Herrera, Nathan Eagle, Marta C. Gonzalez

http://arxiv.org/abs/1407.6256
