The percentage of human deaths caused by interpersonal violence reflects our membership of a particularly violent clade of mammals, although changes in socio-political organization have led to marked variations in this proportion.
The phylogenetic roots of human lethal violence José María Gómez, Miguel Verdú, Adela González-Megías & Marcos Méndez Nature 538, 233–237 (13 October 2016) doi:10.1038/nature19758
Here we sketch a new derivation of Zipf's law for word frequencies based on optimal coding. The structure of the derivation is reminiscent of Mandelbrot's random typing model but it has multiple advantages over random typing: (1) it starts from realistic cognitive pressures, (2) it does not require fine tuning of parameters, and (3) it sheds light on the origins of other statistical laws of language and thus can lead to a compact theory of linguistic laws. Our findings suggest that the recurrence of Zipf's law in human languages could originate from pressure for easy and fast communication.
Compression and the origins of Zipf's law for word frequencies Ramon Ferrer-i-Cancho
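The rank-frequency relation at the heart of this abstract can be made concrete with a small numerical sketch. This illustrates Zipf's law itself, not the paper's optimal-coding derivation; the 1000-rank vocabulary is an arbitrary choice.

```python
# Zipf's law: the r-th most frequent word has frequency f(r) proportional
# to r**(-a), with a close to 1 in natural language. Generate the ideal
# distribution and check its hallmark: frequency halves when rank doubles.

def zipf_frequencies(n_ranks, exponent=1.0):
    """Relative frequencies f(r) ~ r**(-exponent), normalized to sum to 1."""
    weights = [r ** -exponent for r in range(1, n_ranks + 1)]
    total = sum(weights)
    return [w / total for w in weights]

freqs = zipf_frequencies(1000)
ratio = freqs[1] / freqs[3]  # frequency at rank 2 vs rank 4
# For exponent 1 the ratio is exactly 2: doubling the rank halves the frequency.
```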
The study of synchronization of coupled systems is currently undergoing a major surge fueled by recent discoveries of new forms of collective dynamics and the development of techniques to characterize a myriad of new patterns of network synchronization. This includes chimera states, phenomena determined by symmetry, remote synchronization, and asymmetry-induced synchronization. This Focus Issue presents a selection of contributions at the forefront of these developments, to which this introduction is intended to offer an up-to-date foundation.
Introduction to focus issue: Patterns of network synchronization Daniel M. Abrams, Louis M. Pecora and Adilson E. Motter
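As background for readers new to network synchronization, a minimal Kuramoto-style sketch (a standard textbook model, assumed here for illustration and not taken from the Focus Issue itself) shows how all-to-all coupled phase oscillators become coherent once the coupling is strong enough:

```python
import math
import random

def kuramoto_order(n=100, coupling=2.0, spread=0.2, dt=0.05, steps=2000, seed=7):
    """Simulate all-to-all Kuramoto oscillators; return the final order parameter r."""
    rng = random.Random(seed)
    omega = [rng.gauss(0.0, spread) for _ in range(n)]          # natural frequencies
    theta = [rng.uniform(-math.pi, math.pi) for _ in range(n)]  # initial phases
    for _ in range(steps):
        rx = sum(math.cos(t) for t in theta) / n
        ry = sum(math.sin(t) for t in theta) / n
        r, psi = math.hypot(rx, ry), math.atan2(ry, rx)
        # Mean-field form: dtheta_i/dt = omega_i + K * r * sin(psi - theta_i)
        theta = [t + dt * (w + coupling * r * math.sin(psi - t))
                 for t, w in zip(theta, omega)]
    rx = sum(math.cos(t) for t in theta) / n
    ry = sum(math.sin(t) for t in theta) / n
    return math.hypot(rx, ry)

r_sync = kuramoto_order(coupling=2.0)   # well above the critical coupling
r_incoh = kuramoto_order(coupling=0.0)  # uncoupled oscillators stay incoherent
```

With strong coupling the order parameter approaches 1 (synchrony); with zero coupling it stays near the random-phase value of order 1/sqrt(n).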
In this work, we are motivated by the observation that previous considerations of appropriate complexity measures have not directly addressed the fundamental issue that the complexity of any particular matter or thing has a significant subjective component in which the degree of complexity depends on available frames of reference. Any attempt to remove subjectivity from a suitable measure therefore fails to address a very significant aspect of complexity. Conversely, there has been justifiable apprehension toward purely subjective complexity measures, simply because they are not verifiable if the frame of reference being applied is in itself both complex and subjective. We address this issue by introducing the concept of subjective simplicity—although a justifiable and verifiable value of subjective complexity may be difficult to assign directly, it is possible to identify in a given context what is “simple” and, from that reference, determine subjective complexity as distance from simple. We then propose a generalized complexity measure that is applicable to any domain, and provide some examples of how the framework can be applied to engineered systems.
Is it possible to constrain a human society in such a way that self-organization will thereafter tend to produce outcomes that advance the goals of the society? Such a society would be self-organizing in the sense that individuals who pursue only their own interests would nonetheless act in the interests of the society as a whole, irrespective of any intention to do so. I sketch an agent-based model that identifies the conditions that must be met if such a self-organizing society is to emerge. The model draws heavily on an understanding of how self-organizing societies have emerged repeatedly during the evolution of life on Earth (e.g. evolution has produced societies of molecular processes, of simple cells, of eukaryote cells and of multicellular organisms). The model demonstrates that the key enabling requirement for a self-organizing society is ‘consequence-capture’. Broadly, this means that all agents in the society must capture enough of the benefits (and harms) produced by the impact of their actions on the goals of the society. If this condition is not met, agents that invest resources in actions that produce societal benefits will tend to be out-competed by those that do not. This ‘consequence-capture’ condition can be met where a society is managed by appropriate systems of evolvable constraints that suppress free riders and support pro-social actions. In human societies these constraints include institutions such as systems of governance and social norms. If a self-organizing society is to emerge, consequence-capture must occur for all agents in the society, including those involved in the establishment and adaptation of institutions.
By implementing consequence-capture, appropriate institutions can produce a self-organizing society in which the interests of all agents (including individuals, associations, firms, multi-national corporations, political organizations, institutions and governments) are aligned with those of the society as a whole.
The Self-Organizing Society: The Role of Institutions John E. Stewart
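The logic of consequence-capture can be illustrated with a toy public-goods calculation (an assumption-level reading of the argument, not the author's actual agent-based model): free riders out-earn cooperators unless a constraint system returns the consequences of free riding to them, here as a fine.

```python
def payoffs(n_coop, n_total, benefit=3.0, cost=1.0, fine=0.0):
    """Per-capita payoffs (cooperator, free rider) for one round of a
    public-goods interaction. Cooperators pay `cost` to add `benefit` to a
    pool shared equally by everyone; `fine` is the sanction a constraint
    system imposes on free riders (consequence-capture)."""
    public_good = benefit * n_coop / n_total
    return public_good - cost, public_good - fine

# Half of 100 agents cooperate.
no_enforcement = payoffs(50, 100, fine=0.0)    # free riding pays more
with_enforcement = payoffs(50, 100, fine=2.0)  # cooperation pays more
```

Without the fine the free rider's payoff strictly dominates, so cooperators are out-competed, which is exactly the failure mode the abstract describes.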
Science-based board games can help people grasp the ecological complexity of autonomous pest control (APC) in the shade-coffee agroecosystem. Azteca Chess is a board game that captures in a stylized way the fascinating natural history and dynamics of a complex network of direct, indirect and cascading trait-mediated interactions among five species of arthropods dwelling in shade coffee bushes (a coffee scale insect, an ant, a lady beetle (as adult and larva), a parasitoid wasp and a parasitoid fly). In exchange for honeydew, the Azteca ant protects scale insects, which in turn help control the devastating coffee-rust disease. The ant repels the adult lady beetle but inadvertently protects its larvae, which devour scales to local extinction. The head-hunting fly paralyzes Azteca ants and opens a window of opportunity for the adult beetle to oviposit under scales, but also for the parasitoid wasp to kill the beetle larvae. Interactions may or may not cascade towards APC. Experimental test-driving shows that Azteca Chess meets good modeling and game-design standards and has been statistically shown to enhance understanding and application of the relevant complex ecological processes.
Azteca chess: Gamifying a complex ecological process of autonomous pest control in shade coffee Luis García-Barrios, Ivette Perfecto, John Vandermeer
The history of efforts to reduce ‘human errors’ across workplaces and industries suggests that people (or their weaknesses) are seen as a problem to control [1, 3, 15, 16]. However, some have proposed that humans can be heroes, as they can adapt and compensate for weaknesses within a system and direct it away from potential catastrophes. But the existence of heroes would suggest that villains (i.e. humans who cause a disaster) exist as well, and that it might well be the outcome that determines which human becomes which. The purpose of this chapter is to examine whether complex socio-technical systems would allow for the existence of heroes and villains, as outcomes in such systems are usually thought to be the product of interactions rather than of a single factor. The chapter will first examine whether the properties of complex systems as suggested by Dekker et al. would allow for heroes and villains to exist. These include: (a) synthesis and holism, (b) emergence, (c) foreseeability of probabilities, not certainties, (d) time-irreversibility and (e) perpetual incompleteness and uncertainty of knowledge. It concludes with a discussion of the implications of the (non)existence of heroes and villains in complex systems for the way we conduct investigations when something goes wrong inside those systems.
We put high hopes on analyzing big data, but we have failed so far because we haven't found solutions to the essential problems of our society. Questions such as: what is the best way to organise our society in the future, and what is the role of democratic principles in that future? These need to be asked and answered. In the past, globalisation, optimization, administration and regulation served us well and brought us to the level where we are, but as the current economic situation shows, we are in a stagnation and those principles have reached their limits. We need new success principles. ‘I think those success principles are co-creation, co-evolution, collective intelligence, self-organization and self-regulation,’ says Prof. Dr. Dirk Helbing, Computational Social Science, Department of Humanities, Social and Political Sciences, ETH Zurich.
Faced with effectively unlimited choices of how to spend their time, humans are constantly balancing a trade-off between exploitation of familiar places and exploration of new locations. Previous analyses have shown that at the daily and weekly timescales individuals are well characterized by an activity space of repeatedly visited locations. How this activity space evolves in time, however, remains unexplored. Here we analyse high-resolution spatio-temporal traces from 850 individuals participating in a 24-month experiment. We find that, although activity spaces undergo considerable changes, the number of familiar locations an individual visits at any point in time is a conserved quantity. We show that this number is similar for different individuals, revealing a substantial homogeneity of the observed population. We point out that the observed fixed size of the activity space cannot be explained in terms of time constraints, and is therefore a distinctive property of human behavior.
Evidence for a Conserved Quantity in Human Mobility Laura Alessandretti, Piotr Sapiezynski, Sune Lehmann, Andrea Baronchelli
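A conserved activity-space size is easy to sketch with a toy replacement process (assumed dynamics for illustration only; the paper analyses empirical traces, and the capacity and exploration rate below are invented parameters):

```python
import random

def evolve_activity_space(capacity=25, steps=10000, p_explore=0.1, seed=3):
    """Explore a new location with probability p_explore per step; each new
    location replaces a randomly forgotten familiar one, so the size of the
    set of familiar locations stays fixed."""
    rng = random.Random(seed)
    active = list(range(capacity))  # current set of familiar locations
    next_id = capacity              # label for the next never-visited location
    sizes = []
    for _ in range(steps):
        if rng.random() < p_explore:
            active.remove(rng.choice(active))  # one familiar place is forgotten
            active.append(next_id)             # ...and the new one is adopted
            next_id += 1
        sizes.append(len(active))
    return active, sizes

active, sizes = evolve_activity_space()
# The composition of the activity space turns over, but its size is conserved.
```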
We show how the success of deep learning depends not only on mathematics but also on physics: although well-known mathematical theorems guarantee that neural networks can approximate arbitrary functions well, the class of functions of practical interest can be approximated through "cheap learning" with exponentially fewer parameters than generic ones, because they have simplifying properties tracing back to the laws of physics. The exceptional simplicity of physics-based functions hinges on properties such as symmetry, locality, compositionality and polynomial log-probability, and we explore how these properties translate into exceptionally simple neural networks approximating both natural phenomena such as images and abstract representations thereof such as drawings. We further argue that when the statistical process generating the data is of a certain hierarchical form prevalent in physics and machine-learning, a deep neural network can be more efficient than a shallow one. We formalize these claims using information theory and discuss the relation to renormalization group procedures. Various "no-flattening theorems" show when these efficient deep networks cannot be accurately approximated by shallow ones without efficiency loss - even for linear networks.
Why does deep and cheap learning work so well? Henry W. Lin, Max Tegmark
We study the dynamic network of real world person-to-person interactions between approximately 1,000 individuals with 5-min resolution across several months. There is currently no coherent theoretical framework for summarizing the tens of thousands of interactions per day in this complex network, but here we show that at the right temporal resolution, social groups can be identified directly. We outline and validate a framework that enables us to study the statistical properties of individual social events as well as series of meetings across weeks and months. Representing the dynamic network as sequences of such meetings reduces the complexity of the system dramatically. We illustrate the usefulness of the framework by investigating the predictability of human social activity.
Fundamental structures of dynamic social networks Vedran Sekara, Arkadiusz Stopczynski, and Sune Lehmann
Almost all processes -- highly correlated, weakly correlated, or not correlated at all -- exhibit statistical fluctuations. Often physical laws, such as the Second Law of Thermodynamics, address only typical realizations -- as highlighted by Shannon's asymptotic equipartition property and as entailed by taking the thermodynamic limit of an infinite number of degrees of freedom. Indeed, our interpretations of the functioning of macroscopic thermodynamic cycles are so focused. Using a recently derived Second Law for information processing, we show that different subsets of fluctuations lead to distinct thermodynamic functioning in Maxwellian Demons. For example, while typical realizations may operate as an engine -- converting thermal fluctuations to useful work -- even "nearby" fluctuations (nontypical, but probable realizations) behave differently, as Landauer erasers -- converting available stored energy to dissipate stored information. One concludes that ascribing a single, unique functional modality to a thermodynamic system, especially one on the nanoscale, is at best misleading, likely masking an array of simultaneous, parallel thermodynamic transformations. This alters how we conceive of cellular processes, engineering design, and evolutionary adaptation.
Not All Fluctuations are Created Equal: Spontaneous Variations in Thermodynamic Function James P. Crutchfield, Cina Aghamohammadi
This paper presents a novel model of science funding that exploits the wisdom of the scientific crowd. Each researcher receives an equal, unconditional part of all available science funding on a yearly basis, but is required to individually donate to other scientists a given fraction of all they receive. Science funding thus moves from one scientist to the next in such a way that scientists who receive many donations must also redistribute the most. As the funding circulates through the scientific community it is mathematically expected to converge on a funding distribution favored by the entire scientific community. This is achieved without any proposal submissions or reviews. The model furthermore funds scientists instead of projects, reducing much of the overhead and bias of the present grant peer review system. Model validation using large-scale citation data and funding records over the past 20 years shows that the proposed model could yield funding distributions similar to those of the NSF and NIH, while potentially being fairer and more equitable. We discuss possible extensions of this approach as well as science policy implications.
An efficient system to fund science: from proposal review to peer-to-peer distributions Johan Bollen, David Crandall, Damion Junk, Ying Ding, Katy Börner
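The circulation mechanism described above can be sketched as a simple fixed-point computation (the parameters and random donation preferences below are assumptions for illustration, not the authors' calibration):

```python
import random

def yearly_distribution(n=100, base=100.0, donate_frac=0.5, seed=1):
    """One year of the funding-circulation model: everyone gets an equal base
    amount, and all money received (base or donation) must be partly passed
    on to chosen peers. Money circulates until the amount still in motion is
    negligible; what each scientist has kept is that year's distribution."""
    rng = random.Random(seed)
    # Fixed preferences: each scientist donates equally to 5 chosen peers.
    prefs = [rng.sample([j for j in range(n) if j != i], 5) for i in range(n)]
    funds = [0.0] * n       # money each scientist keeps
    moving = [base] * n     # money still circulating
    while sum(moving) > 1e-9:
        nxt = [0.0] * n
        for i in range(n):
            funds[i] += (1 - donate_frac) * moving[i]      # keep a share
            share = donate_frac * moving[i] / len(prefs[i])
            for j in prefs[i]:
                nxt[j] += share                            # redistribute the rest
        moving = nxt
    return funds

funds = yearly_distribution()
# Total funding is conserved; the shape of the distribution is set by donations.
```

The circulating amount shrinks geometrically by the donation fraction each round, so the while loop converges quickly; scientists who attract many donations end up keeping more.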
It is of societal importance to advance the understanding of emerging patterns of biodiversity in biological and ecological systems. The neutral theory offers a statistical-mechanical framework that relates key biological properties at the individual scale with macroecological properties at the community scale. This article surveys the quantitative aspects of neutral theory and its extensions, highlighting for physicists which important problems remain unresolved in the study of ecological systems.
The idea of emergence originates from the fact that global effects emerge from local interactions producing a collective coherent behavior. A particular instance of emergence is illustrated by a flocking model of interacting “boids” encompassing two antagonistic conducts—consensus and frustration—giving rise to highly complex, unpredictable, coherent behavior. The cohesive motion arising from consensus can be described in terms of three ordered dynamic phases. Once frustration is included in the model, local phases for specific groups of flockmates, and transitions among them, replace the global ordered phases. Following the evolution of boids in a single group, we discovered that the boids in this group will alternate among the three phases. When we compare two uncorrelated groups, the second group shows a similar behavior to the first one, but with a different sequence of phases. Besides the visual observation of our animations with marked boids, the result is evident plotting the local order parameters. Rather than adopting one of the consensus ordered phases, the flock motion resembles more an entangled dynamic sequence of phase transitions involving each group of flockmates.
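A minimal consensus-only sketch (the frustration term is omitted, and all parameters are assumed for illustration) shows how a global order parameter captures the emergence of cohesive motion from initially disordered boids:

```python
import math
import random

def order_parameter(headings):
    """Magnitude of the mean heading vector: 1 = full consensus, ~0 = disorder."""
    n = len(headings)
    vx = sum(math.cos(h) for h in headings) / n
    vy = sum(math.sin(h) for h in headings) / n
    return math.hypot(vx, vy)

def align_step(headings, noise, rng):
    """All-to-all consensus: every boid adopts the mean heading, plus noise."""
    mean = math.atan2(sum(math.sin(h) for h in headings),
                      sum(math.cos(h) for h in headings))
    return [mean + rng.uniform(-noise, noise) for _ in headings]

rng = random.Random(0)
headings = [rng.uniform(-math.pi, math.pi) for _ in range(200)]
start = order_parameter(headings)   # near 0 for random headings
for _ in range(20):
    headings = align_step(headings, noise=0.1, rng=rng)
end = order_parameter(headings)     # near 1 once consensus takes hold
```

Adding an antagonistic frustration term, as the flocking model above does, is what replaces this single global ordered state with alternating local phases.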
A reflection of our ultimate understanding of a complex system is our ability to control its behavior. Typically, control has multiple prerequisites: it requires an accurate map of the network that governs the interactions between the system’s components, a quantitative description of the dynamical laws that govern the temporal behavior of each component, and an ability to influence the state and temporal behavior of a selected subset of the components. With deep roots in dynamical systems and control theory, notions of control and controllability have taken a new life recently in the study of complex networks, inspiring several fundamental questions: What are the control principles of complex systems? How do networks organize themselves to balance control with functionality? To address these questions, here we review recent advances on the controllability and the control of complex networks, exploring the intricate interplay between network topology and dynamical laws. The pertinent mathematical results are matched with empirical findings and applications. Uncovering the control principles of complex systems can help us explore and ultimately understand the fundamental laws that govern their behavior.
Control principles of complex systems Yang-Yu Liu and Albert-László Barabási Rev. Mod. Phys. 88, 035006
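One concrete result in this literature, the structural-controllability criterion of Liu, Slotine and Barabási, reduces the minimum number of driver nodes to a maximum-matching computation on the network's bipartite representation. A short sketch with toy graphs and standard augmenting-path matching:

```python
def max_matching(n, edges):
    """Maximum bipartite matching (augmenting paths) where each directed edge
    (u, v) links u on the left to v on the right."""
    adj = {u: [] for u in range(n)}
    for u, v in edges:
        adj[u].append(v)
    match_right = {}  # right node -> matched left node

    def augment(u, seen):
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                if v not in match_right or augment(match_right[v], seen):
                    match_right[v] = u
                    return True
        return False

    return sum(1 for u in range(n) if augment(u, set()))

def min_drivers(n, edges):
    """Minimum number of driver nodes: N minus the maximum matching (at least 1)."""
    return max(n - max_matching(n, edges), 1)

# A directed path 0 -> 1 -> 2 -> 3 is controllable from one driver node.
print(min_drivers(4, [(0, 1), (1, 2), (2, 3)]))  # 1
# A star 0 -> {1, 2, 3}: only one edge can be matched, so 3 drivers are needed.
print(min_drivers(4, [(0, 1), (0, 2), (0, 3)]))  # 3
```

The unmatched nodes are precisely the ones an external controller must drive, which is why hub-and-spoke topologies need many more drivers than chains.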
The competition for the attention of users is a central element of the Internet. Crucial issues are the origin and predictability of big hits, the few items that capture a big portion of the total attention. We address these issues analyzing 10 million time series of videos' views from YouTube. We find that the average gain of views is linearly proportional to the number of views a video already has, in agreement with usual rich-get-richer mechanisms and Gibrat's law, but this fails to explain the prevalence of big hits. The reason is that the fluctuations around the average views are themselves heavy tailed. Based on these empirical observations, we propose a stochastic differential equation with Lévy noise as a model of the dynamics of videos. We show how this model is substantially better in estimating the probability of an ordinary item becoming a big hit, which is considerably underestimated in the traditional proportional-growth models.
Stochastic dynamics and the predictability of big hits in online videos Jose M. Miotto, Holger Kantz, Eduardo G. Altmann
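The mechanism in the abstract (proportional growth plus heavy-tailed shocks) can be sketched in discrete time, with Pareto-distributed shocks standing in as a caricature of Lévy noise; all parameters below are invented for illustration:

```python
import random

def simulate_views(n_videos=10000, steps=100, seed=42):
    """Multiplicative growth N -> N * (1 + c * shock) with heavy-tailed
    shocks: mostly small, occasionally huge. The rare huge shocks are
    what produce big hits."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_videos):
        views = 1.0
        for _ in range(steps):
            shock = rng.paretovariate(1.5) - 3.0  # centered: Pareto(1.5) has mean 3
            views *= 1.0 + 0.05 * shock           # factor in [0.9, inf): heavy right tail
        finals.append(views)
    return finals

views = sorted(simulate_views())
top_share = sum(views[-100:]) / sum(views)  # share of views held by the top 1%
```

With Gaussian shocks of the same scale, the top 1% would hold only slightly more than 1% of total views; the heavy tail concentrates attention far more strongly.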
Invasive mammalian predators are arguably the most damaging group of alien animal species for global biodiversity. Thirty species of invasive predator are implicated in the extinction or endangerment of 738 vertebrate species—collectively contributing to 58% of all bird, mammal, and reptile extinctions. Cats, rodents, dogs, and pigs have the most pervasive impacts, and endemic island faunas are most vulnerable to invasive predators. That most impacted species are insular indicates that management of invasive predators on islands should be a global conservation priority. Understanding and mitigating the impact of invasive mammalian predators is essential for reducing the rate of global biodiversity loss.
The study of social phenomena is becoming increasingly reliant on big data from online social networks. Broad access to social media data, however, requires software development skills that not all researchers possess. Here we present the IUNI Observatory on Social Media, an open analytics platform designed to facilitate computational social science. The system leverages a historical, ongoing collection of over 70 billion public messages from Twitter. We illustrate a number of interactive open-source tools to retrieve, visualize, and analyze derived data from this collection. The Observatory, now available at osome.iuni.iu.edu, is the result of a large, six-year collaborative effort coordinated by the Indiana University Network Science Institute.
Basic research on biodiversity has concentrated on individual species—naming new species, studying distribution patterns, and analyzing their evolutionary relationships. Yet biodiversity is more than a collection of individual species; it is the combination of biological entities and processes that support life on Earth. To understand biodiversity we must catalog it, but we must also assess the ways species interact with other species to provide functional support for the Tree of Life. Ecological interactions may be lost well before the species involved in those interactions go extinct: the species remain, yet their ecological functions disappear. Here, I address the challenges in studying the functional aspects of species interactions and how basic research is helping us address the fast-paced extinction of species due to human activities.
Recently developed information communication technologies, particularly the Internet, have affected how we, both as individuals and as a society, create, store, and recall information. The Internet also provides us with a great opportunity to study memory using large-scale transactional data, in a quantitative framework similar to the practice in statistical physics. In this project, we make use of online data by analysing viewership statistics of Wikipedia articles on aircraft crashes. We study the relation between recent events and past events, focusing in particular on memory-triggering patterns. We devise a quantitative model that explains the flow of viewership from a current event to past events based on similarity in time, geography, and topic, and on the hyperlink structure of Wikipedia articles. We show that, on average, the secondary flow of attention to past events generated by such remembering processes is larger than the primary attention flow to the current event. We are the first to report these cascading effects.
Memory Remains: Understanding Collective Memory in the Digital Age Ruth García-Gavilanes, Anders Mollgaard, Milena Tsvetkova, Taha Yasseri
Estimating systemic risk in networks of financial institutions represents, today, a major challenge in both science and financial policy making. This work shows how the increasing complexity of the network of contracts among institutions comes with the price of increasing inaccuracy in the estimation of systemic risk. The paper offers a quantitative method to estimate systemic risk and its accuracy.
The price of complexity in financial networks Stefano Battiston, Guido Caldarelli, Robert M. May, Tarik Roukny, and Joseph E. Stiglitz
Spontaneous synchronization has long served as a paradigm for behavioral uniformity that can emerge from interactions in complex systems. When the interacting entities are identical and their coupling patterns are also identical, the complete synchronization of the entire network is the state inheriting the system symmetry. As in other systems subject to symmetry breaking, such symmetric states are not always stable. Here, we report on the discovery of the converse of symmetry breaking—the scenario in which complete synchronization is not stable for identically coupled identical oscillators but becomes stable when, and only when, the oscillator parameters are judiciously tuned to nonidentical values, thereby breaking the system symmetry to preserve the state symmetry. Aside from demonstrating that diversity can facilitate and even be required for uniformity and consensus, this suggests a mechanism for convergent forms of pattern formation in which initially asymmetric patterns evolve into symmetric ones.
Symmetric States Requiring System Asymmetry Takashi Nishikawa and Adilson E. Motter Phys. Rev. Lett. 117, 114101