Papers
Scooped by Complexity Digest onto Papers

Lethal weapons and the evolution of civilisation

It's about 2 metres long, made of tough spruce wood and carved into a sharp point at one end. The widest part, and hence its centre of gravity, is in the front third, suggesting it was thrown like a javelin. At 400,000 years old, this is the world's oldest spear. And, according to a provocative theory, on its carved length rests nothing less than the foundation of human civilisation as we know it, including democracy, class divisions and the modern nation state.

Papers
Recent publications related to complex systems

Multiscale Information Theory and the Marginal Utility of Information

Complex systems display behavior at a range of scales. Large-scale behaviors can emerge from the correlated or dependent behavior of individual small-scale components. To capture this observation in a rigorous and general way, we introduce a formalism for multiscale information theory. Dependent behavior among system components results in overlapping or shared information. A system’s structure is revealed in the sharing of information across the system’s dependencies, each of which has an associated scale. Counting information according to its scale yields the quantity of scale-weighted information, which is conserved when a system is reorganized. In the interest of flexibility we allow information to be quantified using any function that satisfies two basic axioms. Shannon information and vector space dimension are examples. We discuss two quantitative indices that summarize system structure: an existing index, the complexity profile, and a new index, the marginal utility of information. Using simple examples, we show how these indices capture the multiscale structure of complex systems in a quantitative way.

 

Multiscale Information Theory and the Marginal Utility of Information
Benjamin Allen, Blake C. Stacey, and Yaneer Bar-Yam

Entropy 2017, 19(6), 273; doi:10.3390/e19060273
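For the two-component case, the scale-weighted conservation described in the abstract can be checked directly with Shannon information: C(1) = H(X,Y) and C(2) = I(X;Y), so C(1) + C(2) = H(X) + H(Y). The following sketch (illustrative code, not from the paper) computes this profile for two binary variables:

```python
import math

def H(dist):
    """Shannon entropy (bits) of a distribution given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def complexity_profile_2(joint):
    """Complexity profile C(k) of a two-variable system.

    joint: dict mapping (x, y) -> probability.
    C(1) = H(X, Y)  (information present at scale >= 1)
    C(2) = I(X; Y)  (information shared by both components)
    Scale-weighted conservation: C(1) + C(2) = H(X) + H(Y).
    """
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    mutual = H(px) + H(py) - H(joint)   # I(X; Y)
    return {1: H(joint), 2: mutual}

# Two perfectly correlated bits: all information is shared.
redundant = {(0, 0): 0.5, (1, 1): 0.5}
# Two independent fair bits: no shared information.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

print(complexity_profile_2(redundant))    # {1: 1.0, 2: 1.0}
print(complexity_profile_2(independent))  # {1: 2.0, 2: 0.0}
```

In both cases the scale-weighted sum C(1) + C(2) equals H(X) + H(Y) = 2 bits, illustrating the conservation property.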


Zika virus evolution and spread in the Americas

One hundred and ten Zika virus genomes from ten countries and territories involved in the Zika virus epidemic reveal rapid expansion of the epidemic within Brazil and multiple introductions to other regions.

 

Zika virus evolution and spread in the Americas
Hayden C. Metsky, et al.

Nature 546, 411–415 (15 June 2017) doi:10.1038/nature22402


Collective benefits in traffic during mega events via the use of information technologies

Information technologies today can inform each of us about the route with the shortest time, but they do not contain incentives to manage travellers such that we all get collective benefits in travel times. To that end we need travel demand estimates and target strategies to reduce the traffic volume from the congested roads during peak hours in a feasible way. During large events, the traffic inconveniences in large cities are unusually high, yet temporary, and the entire population may be more willing to adopt collective recommendations for collective benefits in traffic. In this paper, we integrate, for the first time, big data resources to estimate the impact of events on traffic and propose target strategies for collective good at the urban scale. In the context of the Olympic Games in Rio de Janeiro, we first predict the expected increase in traffic. To that end, we integrate data from mobile phones, Airbnb, Waze and transit information, with game schedules and expected attendance in each venue. Next, we evaluate different route choice scenarios for drivers during the peak hours. Finally, we gather information on the trips that contribute the most to the global congestion which could be redirected from vehicles to transit. Interestingly, we show that (i) following new route alternatives during the event with individual shortest times can save more collective travel time than keeping the routine routes used before the event, uncovering the positive value of information technologies during events; (ii) with only a small proportion of people selected from specific areas switching from driving to public transport, the collective travel time can be reduced to a great extent. Results are presented online for evaluation by the public and policymakers.

 

Collective benefits in traffic during mega events via the use of information technologies
Yanyan Xu, Marta C. González
Published 12 April 2017. DOI: 10.1098/rsif.2016.1041

Journal of the Royal Society Interface

April 2017
Volume 14, issue 129


Efficient method for estimating the number of communities in a network

While there exist a wide range of effective methods for community detection in networks, most of them require one to know in advance how many communities one is looking for. Here we present a method for estimating the number of communities in a network using a combination of Bayesian inference with a novel prior and an efficient Monte Carlo sampling scheme. We test the method extensively on both real and computer-generated networks, showing that it performs accurately and consistently, even in cases where groups are widely varying in size or structure.

 

Efficient method for estimating the number of communities in a network
Maria A. Riolo, George T. Cantwell, Gesine Reinert, M. E. J. Newman


A Theory of Reality as More Than the Sum of Its Parts

New math shows how, contrary to conventional scientific wisdom, conscious beings and other macroscopic entities might have greater influence over the future than does the sum of their microscopic components.

A generalized model of social and biological contagion

We present a model of contagion that unifies and generalizes existing models of the spread of social influences and micro-organismal infections. Our model incorporates individual memory of exposure to a contagious entity (e.g., a rumor or disease), variable magnitudes of exposure (dose sizes), and heterogeneity in the susceptibility of individuals. Through analysis and simulation, we examine in detail the case where individuals may recover from an infection and then immediately become susceptible again (analogous to the so-called SIS model). We identify three basic classes of contagion models, which we call the epidemic threshold, vanishing critical mass, and critical mass classes, where each class of models corresponds to different strategies for prevention or facilitation. We find that the conditions for a particular contagion model to belong to one of these three classes depend only on memory length and the probabilities of being infected by one and two exposures respectively. These parameters are in principle measurable for real contagious influences or entities, thus yielding empirical implications for our model. We also study the case where individuals attain permanent immunity once recovered, finding that epidemics inevitably die out but may be surprisingly persistent when individuals possess memory.

 

A generalized model of social and biological contagion
Peter Sheridan Dodds, Duncan J. Watts
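A minimal well-mixed sketch of a dose-memory contagion of this kind can be written in a few lines; the function name, parameter values and threshold distribution below are illustrative choices, not the authors' implementation:

```python
import random
from collections import deque

def dose_contagion(N=500, T=3, p_expose=0.3, recover=0.2, steps=200, seed=1):
    """Toy well-mixed dose-memory contagion (SIS-like).

    Each agent remembers its last T doses; it becomes infected when the
    doses in memory reach its individual threshold, and recovers (back to
    susceptible) with probability `recover` per step.
    """
    rng = random.Random(seed)
    thresholds = [rng.choice([1, 2]) for _ in range(N)]   # heterogeneous susceptibility
    memory = [deque([0] * T, maxlen=T) for _ in range(N)]
    infected = [False] * N
    for i in rng.sample(range(N), 25):                    # initial seed infections
        infected[i] = True
    for _ in range(steps):
        new_infected = infected[:]
        for i in range(N):
            j = rng.randrange(N)                          # random contact
            dose = 1 if (infected[j] and rng.random() < p_expose) else 0
            memory[i].append(dose)
            if infected[i]:
                if rng.random() < recover:
                    new_infected[i] = False               # recover, immediately susceptible
            elif sum(memory[i]) >= thresholds[i]:
                new_infected[i] = True
        infected = new_infected
    return sum(infected) / N                              # final prevalence

print(dose_contagion())
```

Varying memory length T and the one-dose versus two-dose infection probabilities is what moves such a model between the three classes described in the abstract.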


Crowdsourcing the Robin Hood effect in cities

Socioeconomic inequalities in cities are embedded in space and result in neighborhood effects, whose harmful consequences have proved very hard to counterbalance efficiently by planning policies alone. Considering redistribution of money flows as a first step toward improved spatial equity, we study a bottom-up approach that would rely on a slight evolution of shopping mobility practices. Building on a database of anonymized card transactions in Madrid and Barcelona, we quantify the mobility effort required to reach a reference situation where commercial income is evenly shared among neighborhoods. The redirections of shopping trips preserve key properties of human mobility, including travel distances. Surprisingly, for both cities only a small fraction (5%) of trips need to be modified to reach equality situations, even improving other sustainability indicators. The method could be implemented in mobile applications that would assist individuals in reshaping their shopping practices, to promote the spatial redistribution of opportunities in the city.

 

Crowdsourcing the Robin Hood effect in cities
Thomas Louail, Maxime Lenormand, Juan Murillo Arias and José J. Ramasco
Applied Network Science 2017 2:11
DOI: 10.1007/s41109-017-0026-3


The placement of the head that maximizes predictability. An information theoretic approach

The minimization of the length of syntactic dependencies is a well-established principle of word order and the basis of a mathematical theory of word order. Here we complete that theory from the perspective of information theory, adding a competing word order principle: the maximization of predictability of a target element. These two principles are in conflict: to maximize the predictability of the head, the head should appear last, which maximizes the costs with respect to dependency length minimization. The implications of such a broad theoretical framework to understand the optimality, diversity and evolution of the six possible orderings of subject, object and verb are reviewed.

 

The placement of the head that maximizes predictability. An information theoretic approach
Ramon Ferrer-i-Cancho


Complexity research in Nature Communications

This web collection showcases the potential of interdisciplinary complexity research by bringing together a selection of recent Nature Communications articles investigating complex systems. Complexity research aims to characterize and understand the behaviour and nature of systems made up of many interacting elements. Such efforts often require interdisciplinary collaboration and expertise from diverse schools of thought. Nature Communications publishes papers across a broad range of topics that span the physical and life sciences, making the journal an ideal home for interdisciplinary studies.

Collective navigation of complex networks: Participatory greedy routing


Many networks are used to transfer information or goods, in other words, they are navigated. The larger the network, the more difficult it is to navigate efficiently. Indeed, information routing in the Internet faces serious scalability problems due to its rapid growth, recently accelerated by the rise of the Internet of Things. Large networks like the Internet can be navigated efficiently if nodes, or agents, actively forward information based on hidden maps underlying these systems. However, in reality most agents will refuse to forward messages, which has a cost, and navigation becomes impossible. Can we design appropriate incentives that lead to participation and global navigability? Here, we present an evolutionary game where agents share the value generated by successful delivery of information or goods. We show that global navigability can emerge, but its complete breakdown is possible as well. Furthermore, we show that the system tends to self-organize into local clusters of agents who participate in the navigation. This organizational principle can be exploited to favor the emergence of global navigability in the system.

 

Collective navigation of complex networks: Participatory greedy routing
Kaj-Kolja Kleineberg & Dirk Helbing
Scientific Reports 7, Article number: 2897 (2017)
doi:10.1038/s41598-017-02910-x


Global Reset: Upgrading Society in the Digital Age

The America-dominated era industrialized the world and created previously unseen levels of luxury. It also created a financial industry to make it happen, and a digital infrastructure to watch and control the world. Yet, it has failed to solve the existential challenges of our planet: climate change, environmental destruction, resource depletion. This lack of sustainability is causing wars, mass migration, and a future heading for disaster. A new approach – one that brings people and nature in balance – is urgently needed.
Rescooped by Complexity Digest from Statistical Physics of Ecological Systems

Looplessness in networks is linked to trophic coherence

Complex systems such as cells, brains, or ecosystems are made up of many interconnected elements, each one acting on its neighbors, and sometimes influencing its own state via feedback loops. Certain biological networks have surprisingly few such loops. Although this may be advantageous in various ways, it is not known how feedback is suppressed. We show that trophic coherence, a structural property of ecosystems, is key to the extent of feedback in these as well as in many other systems, including networks related to genes, neurons, metabolites, words, computers, and trading nations. We derive mathematical expressions that provide a benchmark against which to examine empirical data, and conclude that “looplessness” in nature is probably a consequence of trophic coherence.

Via Samir

Deliberative Self-Organizing Traffic Lights with Elementary Cellular Automata

Self-organizing traffic lights have shown considerable improvements compared to traditional methods in computer simulations. Self-organizing methods, however, use sophisticated sensors, increasing their cost and limiting their deployment. We propose a novel approach using simple sensors to achieve self-organizing traffic light coordination. The proposed approach involves placing a computer and a presence sensor at the beginning of each block; each such sensor detects a single vehicle. Each computer builds a virtual environment simulating vehicle movement to predict arrivals and departures at the downstream intersection. At each intersection, a computer receives information across a data network from the computers of the neighboring blocks and runs a self-organizing method to control traffic lights. Our simulations showed a superior performance for our approach compared with a traditional method (a green wave) and a similar performance (close to optimal) compared with a self-organizing method using sophisticated sensors but at a lower cost. Moreover, the developed sensing approach exhibited greater robustness against sensor failures.

 

Zapotecatl, J. L., Rosenblueth, D. A., and Gershenson, C. (2017). Deliberative self-organizing traffic lights with elementary cellular automata. Complexity, 2017:7691370.
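The vehicle movement being predicted in each block's virtual environment can be illustrated with elementary CA rule 184, the standard minimal traffic model in which a car advances only when the cell ahead is free. This is a generic illustration of the cellular-automaton ingredient, not the paper's code:

```python
def rule184_step(road):
    """One update of elementary CA rule 184 on a ring road.

    A cell is occupied next step if a car moves in from the left
    (left occupied, current cell empty), or the car already here is
    blocked (current and right cells both occupied)."""
    n = len(road)
    new = [0] * n
    for i in range(n):
        left, here, right = road[i - 1], road[i], road[(i + 1) % n]
        new[i] = 1 if (left == 1 and here == 0) or (here == 1 and right == 1) else 0
    return new

road = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]   # 1 = car, 0 = empty cell
for _ in range(5):
    road = rule184_step(road)
print(road, sum(road), "cars")
```

Because cars only move and are never created or destroyed, the number of cars is conserved, which is what lets a single presence sensor per block suffice to keep the virtual road state consistent.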


How to fight corruption

Anticorruption initiatives are often put forth as solutions to problems of waste and inefficiency in government programs. It's easy to see why. So often, somewhere along the chain that links the many participants in public service provision or other government activities, funds may get stolen or misdirected, bribes exchanged for preferential treatment, or genuine consumers of public services supplemented by “ghost” users. As a result, corruption reduces economic growth and leaves citizens disillusioned and distrustful of government. It is tempting to think that more monitoring, stricter sanctions, or positive inducements for suitable behavior will reduce corruption. But every anticorruption or antifraud program elicits a strategic response by those who orchestrated and benefited from wrongdoing in the first place. How can these unintended consequences be anticipated and avoided?

 

How to fight corruption
Raymond Fisman, Miriam Golden

Science  26 May 2017:
Vol. 356, Issue 6340, pp. 803-804
DOI: 10.1126/science.aan081


The Human Microbiome and the Missing Heritability Problem

The “missing heritability” problem states that genetic variants in Genome-Wide Association Studies (GWAS) cannot completely explain the heritability of complex traits. Traditionally, the heritability of a phenotype is measured through familial studies using twins, siblings and other close relatives, making assumptions on the genetic similarities between them. When this heritability is compared to the one obtained through GWAS for the same traits, a substantial gap between both measurements arises, with genome wide studies reporting significantly smaller values. Several mechanisms for this “missing heritability” have been proposed, such as epigenetics, epistasis, and sequencing depth. However, none of them are able to fully account for this gap in heritability. In this paper we provide evidence that suggests that in order for the phenotypic heritability of human traits to be broadly understood and accounted for, the compositional and functional diversity of the human microbiome must be taken into account. This hypothesis is based on several observations: (A) The composition of the human microbiome is associated with many important traits, including obesity, cancer, and neurological disorders. (B) Our microbiome encodes a second genome with nearly 100 times more genes than the human genome, and this second genome may act as a rich source of genetic variation and phenotypic plasticity. (C) Human genotypes interact with the composition and structure of our microbiome, but cannot by themselves explain microbial variation. (D) Microbial genetic composition can be strongly influenced by the host's behavior, its environment or by vertical and horizontal transmissions from other hosts. Therefore, genetic similarities assumed in familial studies may cause overestimations of heritability values. We also propose a method that allows the compositional and functional diversity of our microbiome to be incorporated into genome wide association studies.

 

The Human Microbiome and the Missing Heritability Problem

Santiago Sandoval-Motta, Maximino Aldana, Esperanza Martínez-Romero and Alejandro Frank

Front. Genet., 13 June 2017 | https://doi.org/10.3389/fgene.2017.00080


Trilobite ‘pelotons’: possible hydrodynamic drag effects between leading and following trilobites in trilobite queues

Energy saving mechanisms in nature allow following organisms to expend less energy than leaders. Queues, or ordered rows of individuals, may form when organisms exploit the available energy saving mechanism while travelling at near-maximal sustainable metabolic capacities; compact clusters form when group members travel well below maximal sustainable metabolic capacities. The group size range, given here as the ratio of the difference between the size of the largest and smallest group members, and the size of the largest member (as a percentage), has been hypothesized to correspond proportionately to the energy saving quantity because weaker, smaller, individuals sustain the speeds of stronger, larger, individuals by exploiting the energy saving mechanism (as a percentage). During migration, small individuals outside this range may perish, or form sub-groups, or simply not participate in migratory behaviour. We approximate drag forces for leading and following individuals in queues of the late Devonian (c. 370 Ma) trilobite Trimerocephalus chopini. Applying data from literature on Rectisura herculea, a living crustacean, we approximate the hypothetical walking speed and maximal sustainable speeds for T. chopini. Our findings reasonably support the hypothesis that among the population of fossilized queues of T. chopini reported in the literature, trilobite size range was 75%, while the size range within queues was 63%; this corresponds reasonably with drag reductions in following positions that permit c. 61.5% energy saving for trilobites following others in optimal low-drag positions. We model collective trilobite behaviour associated with hydrodynamic drafting.

 

Trilobite ‘pelotons’: possible hydrodynamic drag effects between leading and following trilobites in trilobite queues
Hugh Trenchard, Carlton E. Brett, Matjaž Perc

Palaeontology

Volume 60, Issue 4
July 2017
Pages 557–569

10.1111/pala.12301


Emergent Network Modularity

We introduce a network growth model based on complete redirection: a new node randomly selects an existing target node, but attaches to a random neighbor of this target. For undirected networks, this simple growth rule generates unusual, highly modular networks. Individual network realizations typically contain multiple macrohubs---nodes whose degree scales linearly with the number of nodes N. The size of the network "nucleus"---the set of nodes of degree greater than one---grows sublinearly with N and thus constitutes a vanishingly small fraction of the network. The network therefore consists almost entirely of leaves (nodes of degree one) as N → ∞.

 

Emergent Network Modularity
P. L. Krapivsky, S. Redner
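The growth rule is simple enough to sketch in a few lines; the seed graph and network size below are arbitrary illustrative choices:

```python
import random

def grow_redirect(N, seed=0):
    """Grow an undirected network by complete redirection: each new node
    picks a uniformly random existing target, then attaches to a random
    neighbor of that target instead of the target itself."""
    rng = random.Random(seed)
    adj = {0: {1}, 1: {0}}                     # seed graph: a single edge
    for new in range(2, N):
        target = rng.randrange(new)            # uniformly random existing node
        neighbor = rng.choice(sorted(adj[target]))
        adj[new] = {neighbor}                  # attach to the target's neighbor
        adj[neighbor].add(new)
    return adj

adj = grow_redirect(5000)
leaves = sum(1 for nbrs in adj.values() if len(nbrs) == 1)
edges = sum(len(nbrs) for nbrs in adj.values()) // 2
print(f"{edges} edges, leaf fraction {leaves / len(adj):.3f}")
```

Because every new node attaches by exactly one edge, the network is a tree with N − 1 edges, and even at this modest size the overwhelming majority of nodes are leaves, consistent with the vanishing nucleus described in the abstract.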


The Self-Organizing Society: A Grower's Guide

Can a human society be constrained in such a way that self-organization will thereafter tend to produce outcomes that advance the goals of the society? Such a society would be self-organizing in the sense that individuals who pursue only their own interests would nonetheless act in the interests of the society as a whole, irrespective of any intention to do so. This paper identifies the conditions that must be met if such a self-organizing society is to emerge. It demonstrates that the key enabling requirement for a self-organizing society is consequence-capture. Broadly this means that all agents in the society must capture sufficient of the benefits (and harms) that are produced by their actions on the goals of the society. Consequence-capture can be organized in a society by appropriate management (systems of evolvable constraints) that suppresses free riders and supports pro-social actions. In human societies these constraints include institutions such as systems of governance and social norms. The paper identifies ways of organizing societies so that effective governance will also self-organize. This will produce a fully self-organizing society in which the interests of all agents (including individuals, associations, firms, multi-national corporations, political organizations, institutions and governments) are aligned with the interests of the society as a whole.

 

The Self-Organizing Society: A Grower's Guide
John E. Stewart


The Self-Organization of Dragon Kings

Surprisingly common outliers of a distribution tail, known as Dragon Kings, are seen in many complex systems. It has been argued that the general conditions for Dragon Kings in self-organized systems are high system coupling and low heterogeneity. In this Letter, we introduce a novel mechanism of Dragon Kings by discussing two closely-related stylized models of cascading failures. Although the first variant (based on simple contagion spreading and inoculation) exhibits well-studied self-organized criticality, the second one (based on both simple and complex contagion spreading) creates self-organized Dragon Kings in the failure size distribution. Next, we begin to understand the mechanistic origin of these Dragon Kings by mapping the probability of an initial cascade to a generalized birthday problem, which helps demonstrate that the Dragon King cascade is due to initial failures whose size exceeds a threshold that is infinitesimal compared to the size of the network. We use this finding to predict the onset of Dragon Kings with high accuracy using only logistic regression. Finally, we devise a simple control strategy that can decrease the frequency of Dragon Kings by orders of magnitude. We conclude with remarks on the applicability of both models to natural and engineered systems.

 

The Self-Organization of Dragon Kings
Yuansheng Lin, Keith Burghardt, Martin Rohden, Pierre-André Noël, Raissa M. D'Souza


Bitcoin ecology: Quantifying and modelling the long-term dynamics of the cryptocurrency market

The cryptocurrency market has reached a record of $91 billion market capitalization in May 2017, after months of steady growth. Despite its increasing relevance in the financial world, however, a comprehensive analysis of the whole system is still lacking, as most studies have focused exclusively on the behavior of one (Bitcoin) or few cryptocurrencies. Here, we consider the history of the entire market and analyze the behavior of 1,469 cryptocurrencies introduced since April 2013. We reveal that, while new cryptocurrencies appear and disappear continuously and their market capitalization is increasing exponentially, several statistical properties of the market have been stable for years. These include the number of active cryptocurrencies, the market share distribution and the turnover of cryptocurrencies. Adopting an ecological perspective, we show that the so-called neutral model of evolution is able to reproduce a number of key empirical observations, despite its simplicity and the assumption of no selective advantage of one cryptocurrency over another. Our results shed light on the properties of the cryptocurrency market and establish a first formal link between ecological modeling and the study of this growing system. We anticipate they will spark further research in this direction.

 

Bitcoin ecology: Quantifying and modelling the long-term dynamics of the cryptocurrency market
Abeer ElBahrawy, Laura Alessandretti, Anne Kandler, Romualdo Pastor-Satorras, Andrea Baronchelli
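A neutral model of the kind invoked here can be sketched as a Hubbell-style process in which units of capital copy one another at random and occasionally found a brand-new currency; all names and parameter values below are illustrative, not the paper's calibration:

```python
import random
from collections import Counter

def neutral_market(W=2000, mu=0.01, steps=20000, seed=7):
    """Hubbell-style neutral model of a market: W units of capital; at each
    step a random unit is reassigned by copying another random unit's
    currency, or, with probability mu, to a brand-new currency. No currency
    has any intrinsic selective advantage."""
    rng = random.Random(seed)
    units = [0] * W                    # every capital unit starts in currency 0
    next_id = 1
    for _ in range(steps):
        i = rng.randrange(W)
        if rng.random() < mu:
            units[i] = next_id         # "innovation": a new cryptocurrency enters
            next_id += 1
        else:
            units[i] = units[rng.randrange(W)]   # drift: copy a random unit
    shares = Counter(units)
    return {c: n / W for c, n in shares.items()}

shares = neutral_market()
print(len(shares), "active currencies; top share", f"{max(shares.values()):.2f}")
```

Despite every currency being identical, such drift-plus-innovation dynamics produce a skewed share distribution and constant turnover, the kind of statistical regularities the abstract reports for the real market.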


Multiplex model of mental lexicon reveals explosive learning in humans

Similarities among words affect language acquisition and processing in a multi-relational way barely accounted for in the literature. We propose a multiplex network representation of word similarities in a mental lexicon as a natural framework for investigating large-scale cognitive patterns. Our model accounts for semantic, taxonomic, and phonological interactions and identifies a cluster of words of higher frequency, easier to identify, memorise and learn and with more meanings than expected at random. This cluster emerges around age 7 yr through an explosive transition not reproduced by null models. We relate this phenomenon to polysemy, i.e. redundancy in word meanings. We show that the word cluster acts as a core for the lexicon, increasing both its navigability and robustness to degradation in cognitive impairments. Our findings provide quantitative confirmation of existing psycholinguistic conjectures about core structure in the mental lexicon and the importance of integrating multi-relational word-word interactions in suitable frameworks.

 

Multiplex model of mental lexicon reveals explosive learning in humans
Massimo Stella, Nicole M. Beckage, Markus Brede, Manlio De Domenico

Rescooped by Complexity Digest from Sustainable Complex Coevolutionary Systems Engineering

Extended spider cognition

There is a tension between the conception of cognition as a central nervous system (CNS) process and a view of cognition as extending towards the body or the contiguous environment. The centralised conception requires large or complex nervous systems to cope with complex environments. Conversely, the extended conception involves the outsourcing of information processing to the body or environment, thus making fewer demands on the processing power of the CNS. The evolution of extended cognition should be particularly favoured among small, generalist predators such as spiders, and here, we review the literature to evaluate the fit of empirical data with these contrasting models of cognition. Spiders do not seem to be cognitively limited, displaying a large diversity of learning processes, from habituation to contextual learning, including a sense of numerosity. To tease apart the central from the extended cognition, we apply the mutual manipulability criterion, testing the existence of reciprocal causal links between the putative elements of the system. We conclude that the web threads and configurations are integral parts of the cognitive systems. The extension of cognition to the web helps to explain some puzzling features of spider behaviour and seems to promote evolvability within the group, enhancing innovation through cognitive connectivity to variable habitat features. Graded changes in relative brain size could also be explained by outsourcing information processing to environmental features. More generally, niche-constructed structures emerge as prime candidates for extending animal cognition, generating the selective pressures that help to shape the evolving cognitive system.

 

Extended spider cognition

Japyassú, H.F. & Laland, K.N. Anim Cogn (2017) 20: 375. doi:10.1007/s10071-017-1069-7


Via Dr Alejandro Martinez-Garcia
Dr Alejandro Martinez-Garcia's curator insight, June 7, 3:20 PM
Clark's idea of 'extended phenotype' does not only apply to humans...
 

Model of the best-of-N nest-site selection process in honeybees

Bees are smart, anybody knows that, but swarms are smarter. They have the ability to choose the best dwelling place among a set of potential nest sites with different qualities. Signalling serves the bees to convince their mates to choose the same site they have visited, and to prevent that other bees recruit to different sites. Our latest study proposes that the frequency of signalling is a key parameter: on the one hand, scarce signalling among bees hampers the attainment of consensus within the swarm; on the other hand, too frequent signalling reduces decision accuracy by quickly committing to early-discovered inferior quality options. The optimal signalling frequency lies in the middle. We suggest that the ability of bees to fine-tune their communication frequencies helps them to master their house-hunting task. Hence, this study hypothesises how ecological factors determining the density of suitable nest sites may have led to selective pressures for the evolution of an optimal stable signalling frequency. It also indicates a possible signalling strategy of honeybees: starting with few signals and gradually increasing the signalling frequency through time, until convergence is reached. In addition, our results may lead to the implementation of better algorithms for distributed decision making, to be employed in sensor networks or robot swarms.

 

A. Reina, J.A.R. Marshall, V. Trianni, T. Bose. Model of the best-of-N nest-site selection process in honeybees. Physical Review E, 95(5): 052411, 2017.
URL: http://link.aps.org/doi/10.1103/PhysRevE.95.052411
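The qualitative behaviour, commitment to the better site through discovery, recruitment, abandonment and cross-inhibition (stop signals), can be sketched with a crude Euler integration of a two-option model in the same spirit; the rate expressions and parameter values below are illustrative assumptions, not the paper's fitted model:

```python
def bee_decision(qa=10.0, qb=7.0, k=1.0, dt=0.001, T=100.0):
    """Euler integration of a toy two-option decision model: discovery and
    recruitment scale with site quality q, abandonment scales with 1/q, and
    cross-inhibition (stop signalling) has strength k. A and B are the
    fractions of the swarm committed to sites a and b."""
    A = B = 0.0
    t = 0.0
    while t < T:
        U = 1.0 - A - B                                  # uncommitted fraction
        dA = qa * U + qa * A * U - A / qa - k * qb * B * A
        dB = qb * U + qb * B * U - B / qb - k * qa * A * B
        A += dt * dA
        B += dt * dB
        t += dt
    return A, B

A, B = bee_decision()
print(f"committed to a: {A:.2f}, to b: {B:.2f}")
```

With the higher-quality site a (qa > qb), the swarm converges on a near-unanimous commitment to a; setting the cross-inhibition strength k too low or too high in such models degrades either consensus or accuracy, the trade-off the summary describes.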

Rescooped by Complexity Digest from Statistical Physics of Ecological Systems

Biodiversity @Nature

The tremendous diversity of life on Earth — a result of more than three billion years of evolutionary history — is facing an uncertain future. This Insight looks at how this biodiversity came to be, how it supports the goods and ecosystem services on which we depend and how it is being put to the test by the rapidly expanding human population. Crucially, strategies to safeguard this diversity are explored.

Via Samir
chao pan's curator insight, June 15, 6:51 PM

enable students to explore the biodiversity around them

Suggested by mohsen mosleh

Fair Topologies: Community Structures and Network Hubs Drive Emergence of Fairness Norms


Fairness has long been argued to govern human behavior in a wide range of social, economic, and organizational activities. The sense of fairness, although universal, varies across different societies. In this study, using a computational model, we test the hypothesis that the topology of social interaction can causally explain some of the cross-societal variations in fairness norms. We show that two network parameters, namely, community structure, as measured by the modularity index, and network hubiness, represented by the skewness of degree distribution, have the most significant impact on emergence of collective fair behavior. These two parameters can explain much of the variations in fairness norms across societies and can also be linked to hypotheses suggested by earlier empirical studies in social and organizational sciences. We devised a multi-layered model that combines local agent interactions with social learning, thus enabling both strategic behavior and diffusion of successful strategies. By applying multivariate statistics on the results, we obtain the relation between network structural features and the collective fair behavior.

 

Fair Topologies: Community Structures and Network Hubs Drive Emergence of Fairness Norms

Mohsen Mosleh, Babak Heydari
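A stripped-down version of such a model, a networked ultimatum game with imitation learning, can be sketched as follows; this is a simplification for illustration, not the authors' multi-layered model, and all parameter values are assumptions:

```python
import random

def ultimatum_on_network(adj, rounds=500, noise=0.05, seed=3):
    """Toy networked ultimatum game: each agent i has an offer p[i] and an
    acceptance threshold q[i] in [0, 1]. Agents propose to every neighbor
    (a deal happens when the offer meets the responder's threshold), then
    copy the strategy of their best-scoring neighbor with small mutation."""
    rng = random.Random(seed)
    n = len(adj)
    p = [rng.random() for _ in range(n)]
    q = [rng.random() for _ in range(n)]
    for _ in range(rounds):
        payoff = [0.0] * n
        for i in range(n):
            for j in adj[i]:
                if p[i] >= q[j]:              # i proposes, j accepts
                    payoff[i] += 1 - p[i]
                    payoff[j] += p[i]
        new_p, new_q = p[:], q[:]
        for i in range(n):                    # social learning: imitate the best
            best = max(list(adj[i]) + [i], key=lambda x: payoff[x])
            new_p[i] = min(1.0, max(0.0, p[best] + rng.gauss(0, noise)))
            new_q[i] = min(1.0, max(0.0, q[best] + rng.gauss(0, noise)))
        p, q = new_p, new_q
    return sum(p) / n                         # mean offer: an emergent "norm"

ring = {i: [(i - 1) % 50, (i + 1) % 50] for i in range(50)}
print(f"mean offer on a ring: {ultimatum_on_network(ring):.2f}")
```

Running such a sketch on topologies with different modularity or degree skewness (e.g., a ring versus a star) is the kind of comparison the abstract describes, though the full model adds a second layer of strategic reasoning.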
