Most individuals in social networks experience a so-called Friendship Paradox: on average, they are less popular than their friends. This effect may explain recent findings that widespread social media use leads to reduced happiness. However, the relation between popularity and happiness is poorly understood. A Friendship Paradox does not necessarily imply a Happiness Paradox, in which most individuals are less happy than their friends. Here we report the first direct observation of a significant Happiness Paradox in a large-scale online social network of 39,110 Twitter users. Our results reveal that popular individuals are indeed happier and that a majority of individuals experience a significant Happiness Paradox. The magnitude of the latter effect is shaped by complex interactions between individual popularity, happiness, and the fact that users cluster assortatively by level of happiness. Our results indicate that the topology of online social networks and the distribution of happiness in some populations can cause widespread psycho-social effects that affect the well-being of billions of individuals.
The happiness paradox: your friends are happier than you Johan Bollen, Bruno Gonçalves, Ingrid van de Leemput, Guangchen Ruan
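The Friendship Paradox itself is a purely structural effect of degree heterogeneity, which a tiny toy network can reproduce. The sketch below is an illustrative example, not the paper's Twitter data or code; it counts how many nodes are less popular than their friends on average:

```python
# Illustrative toy network, not the paper's data: node "a" is a popular hub.
adjacency = {
    "a": ["b", "c", "d", "e"],
    "b": ["a"],
    "c": ["a"],
    "d": ["a", "e"],
    "e": ["a", "d"],
}

def experiences_paradox(node):
    """True if the node's friends have a higher mean degree than the node itself."""
    friends = adjacency[node]
    mean_friend_degree = sum(len(adjacency[f]) for f in friends) / len(friends)
    return mean_friend_degree > len(adjacency[node])

paradox_share = sum(experiences_paradox(n) for n in adjacency) / len(adjacency)
print(f"{paradox_share:.0%} of nodes are less popular than their friends")  # 80%
```

The single hub `a` drags up every other node's mean friend degree, so four of the five nodes experience the paradox even though only one node is genuinely popular.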
The prevalence of cooperation in human societies is a puzzle that has caught the eye of researchers from multiple fields. Why is it that people act selflessly and often incur costs to aid others? Reputations are intimately linked with the answer to this question, and so are the social norms that dictate what counts as a good or a bad action. Here we present a mathematical framework to analyze the relationship between different social norms and the sustainability of cooperation in populations of arbitrary size. Indeed, cooperation, norms, reciprocity, and the art of managing reputations have accompanied humans from their prehistoric existence in small-scale societies to contemporary times, when technology supports interaction with large numbers of people. We show that population size is relevant when evaluating the merits of each social norm, and we conclude that one social norm is especially effective in leveraging cooperation in small populations. That simple norm dictates that only whoever cooperates with good individuals, and defects against bad ones, deserves a good reputation.
Infomercialist and pop psychologist Barbara De Angelis puts it this way: “Love is a force more formidable than any other.” Whether you agree with her or not, De Angelis is doing something we do all the time—she is using the language of physics to describe social phenomena.
“I was irresistibly attracted to him”; “You can’t force me”; “We recognize the force of public opinion”; “I’m repelled by these policies.” We can’t measure any of these “social forces” in the way that we can measure gravity or magnetic force. But not only has physics-based thinking entered our language, it is also at the heart of many of our most important models of social behavior, from economics to psychology. The question is, do we want it there?
Nowadays, scientific challenges usually require approaches that cross traditional boundaries between academic disciplines, driving many researchers towards interdisciplinarity. Despite its obvious importance, there is a lack of studies on how to quantify the influence of interdisciplinarity on research impact, which makes proper evaluation for hiring and funding purposes uncertain. Here we propose a method, based on the analysis of bipartite interconnected multilayer networks of citations and disciplines, to assess the interdisciplinary importance of scholars, institutions, and countries. Using data about physics publications and US patents, we show quantitatively that being more interdisciplinary causes -- in the Granger sense -- benefits in scientific productivity and impact. The proposed method could be used by funding agencies, universities, and science policy makers for hiring and funding purposes, and to complement existing methods for ranking universities and countries.
Evaluating the impact of interdisciplinary research: a multilayer network approach Elisa Omodei, Manlio De Domenico, Alex Arenas
A new technique outlined in the paper below allows one to use mutual information not just to measure overall dependence, but to measure the exact fraction of that dependence that is linear (or that follows any other fitted function). Therefore, rather than describing a dependence as 'linear' or 'nonlinear', the relative strength of each component can be measured.
A mutual information approach to calculating nonlinearity Reginald D. Smith
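As a rough illustration of the general idea, not the paper's estimator, the sketch below compares two numbers for a noisy quadratic relationship: a plug-in mutual-information estimate from a 2-D histogram, and the mutual information that a purely linear (Gaussian) dependence with the same Pearson correlation would carry, -0.5 * log(1 - r^2). A large gap between the two suggests the dependence is almost entirely nonlinear:

```python
import math
import random

# Synthetic data with a strong nonlinear but nearly zero linear dependence.
random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(5000)]
ys = [x * x + random.gauss(0, 0.05) for x in xs]

def pearson_r(a, b):
    """Sample Pearson correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / n
    va = sum((u - ma) ** 2 for u in a) / n
    vb = sum((v - mb) ** 2 for v in b) / n
    return cov / math.sqrt(va * vb)

def binned_mi(a, b, bins=10):
    """Crude plug-in mutual-information estimate (in nats) from a 2-D histogram."""
    def idx(v, lo, hi):
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)
    la, ha, lb, hb = min(a), max(a), min(b), max(b)
    joint = {}
    for u, v in zip(a, b):
        key = (idx(u, la, ha), idx(v, lb, hb))
        joint[key] = joint.get(key, 0) + 1
    n = len(a)
    pa, pb = {}, {}
    for (i, j), c in joint.items():
        pa[i] = pa.get(i, 0) + c
        pb[j] = pb.get(j, 0) + c
    return sum(c / n * math.log(c * n / (pa[i] * pb[j]))
               for (i, j), c in joint.items())

linear_mi = -0.5 * math.log(1 - pearson_r(xs, ys) ** 2)  # Gaussian/linear MI
total_mi = binned_mi(xs, ys)                             # total dependence
print(f"linear-implied MI: {linear_mi:.3f} nats, binned MI: {total_mi:.3f} nats")
```

Here `linear_mi` is close to zero (the correlation of a symmetric quadratic is near zero) while the binned estimate is substantial, so nearly all of the measured dependence is nonlinear.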
The aim of the present study is to provide a picture of geopolitical globalization: the role of all the world's countries, together with their contribution towards globalization, is highlighted. In the context of the present study, every country owes its efficiency, and therefore its contribution towards structuring the world, to the position it holds in a complex global network. A country's position in this network is shown to provide a measure of its "contribution" and "importance". As a matter of fact, the visa status conditions between countries reflect their contribution towards geopolitical globalization. Based on the visa status of all countries, community detection reveals the existence of 4+1 main communities. The community constituted by the developed countries has the highest clustering coefficient, equal to 0.9. In contrast, the community constituted by the former Eastern European bloc, the Middle Eastern countries, and the former Soviet Union has the lowest clustering coefficient, approximately equal to 0.65. PR China is the exceptional case. Thus, the picture of the globe presented in this study contributes towards understanding "how the world works".
How visas shape and make visible the geopolitical architecture of the planet Meghdad Saeedian, Tayeb Jamali, S. Vasheghani Farahani, G. R. Jafari, Marcel Ausloos
Almost 100 years ago today, Albert Einstein predicted the existence of gravitational waves — ripples in the fabric of space-time that are set off by extremely violent, cosmic cataclysms in the early universe. With his knowledge of the universe and the technology available in 1916, Einstein assumed that such ripples would be “vanishingly small” and nearly impossible to detect. The astronomical discoveries and technological advances over the past century have changed those prospects. Now for the first time, scientists in the LIGO Scientific Collaboration — with a prominent role played by researchers at MIT and Caltech — have directly observed the ripples of gravitational waves in an instrument on Earth. In so doing, they have again dramatically confirmed Einstein’s theory of general relativity and opened up a new way in which to view the universe.
The first issue of Chaos, published in July of 1991, comprised a selection of 14 now-classic papers authored by leading researchers in nonlinear dynamics.1–14 While some of their distinguished authors—including Vladimir Arnold, Boris Chirikov, and George Zaslavsky—are no longer with us, many of the contributors to the first issue remain active in research and some—Irving Epstein and Leon Glass—are in fact authors of papers in this 25th anniversary issue.
Introduction to Focus Issue: The 25th Anniversary of Chaos: Perspectives on Nonlinear Science—Past, Present, and Future Elizabeth Bradley, Adilson E. Motter and Louis M. Pecora
P-values are widely used in both the social and natural sciences to quantify the statistical significance of observed results. The recent surge of big data research has made the p-value an even more popular tool to test the significance of a study. However, a substantial literature has been produced critiquing how p-values are used and understood. In this paper we review this recent critical literature, much of which is rooted in the life sciences, and consider its implications for social scientific research. We provide a coherent picture of what the main criticisms are, and draw together and disambiguate common themes. In particular, we explain how the False Discovery Rate is calculated, and how this differs from a p-value. We also make explicit the Bayesian nature of many recent criticisms, a dimension that is often underplayed or ignored. Finally, we identify practical steps to help remediate some of the concerns identified, and argue that p-values need to be contextualised within (i) the specific study, and (ii) the broader field of inquiry.
P-values: misunderstood and misused Bertie Vidgen, Taha Yasseri
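To make one of the review's distinctions concrete: a p-value bounds the chance of data this extreme under a single null hypothesis, while the False Discovery Rate concerns the expected share of false alarms among all rejected nulls. The sketch below, using invented p-values rather than any from the paper, implements the standard Benjamini-Hochberg procedure for controlling the FDR at level q:

```python
def benjamini_hochberg(p_values, q=0.05):
    """Return the sorted p-values declared significant under BH FDR control."""
    ranked = sorted(p_values)
    m = len(ranked)
    # Find the largest rank k with p_(k) <= (k/m) * q; reject hypotheses 1..k.
    cutoff_rank = 0
    for k, p in enumerate(ranked, start=1):
        if p <= k / m * q:
            cutoff_rank = k
    return ranked[:cutoff_rank]

p_vals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.74]
significant = benjamini_hochberg(p_vals)
print(significant)  # [0.001, 0.008]
```

Note that the procedure rejects only two hypotheses here, whereas a naive "p < 0.05" rule would reject five: controlling the rate of false discoveries is stricter than thresholding each test in isolation.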
There is a huge amount of content produced online by amateur authors, covering a large variety of topics. Sentiment analysis (SA) extracts and aggregates users' sentiments towards a target entity. Machine learning (ML) techniques are frequently used, as natural language data is abundant and has definite patterns. ML techniques adapt to domain-specific solutions with high accuracy, depending upon the feature set used. Lexicon-based techniques, which use an external dictionary, are independent of the data and so avoid overfitting, but they also miss context in specialized domains. Corpus-based statistical techniques require large amounts of data to stabilize. Complex-network-based techniques are highly resourceful, preserving order, proximity, context, and relationships. Recently developed applications incorporate platform-specific structural information, i.e. meta-data. New sub-domains have been introduced, such as influence analysis, bias analysis, and data leakage analysis. The nature of the data is also evolving: transcribed customer-agent phone conversations are now also used for sentiment analysis. This paper reviews sentiment analysis techniques and highlights the need to address open challenges specific to natural language processing (NLP). Without resolving these complex NLP challenges, ML techniques cannot make considerable advancements. The open issues and challenges in the area are discussed, stressing the need for standard datasets and evaluation methodology. The paper also emphasizes the need for better language models that can capture context and proximity.
Sentiment analysis and the complex natural language Muhammad Taimoor Khan, Mehr Durrani, Armughan Ali, Irum Inayat, Shehzad Khalid and Kamran Habib Khan
The tensile strength of a chain is determined by its weakest link. Does this idea apply to more complex systems too? For instance, does the weakest thread of a spider web initiate cascading failure when a strong wind gust stretches the web to its limit? What happens to a computer when both the supply voltage and the ambient temperature are more than 20% outside its normal range of operation? Climate change, an increasingly densely populated world, and the rapid change of technology seem to put more systems under large stress. Engineering sustainable systems with a more favorable response to large stress appears to be an urgent societal need. Emergency evacuations of hospitals after hurricanes Katrina and Sandy, and after the May 22, 2011 tornado in Joplin, illustrate the urgent need for modeling the adaptive capacity of hospitals during an extended loss of infrastructure. Presidential Policy Directive 21 and the U.S. Department of Homeland Security National Infrastructure Protection Plan (NIPP) call for increasing the resilience of the nation's critical infrastructure.
System under large stress: Prediction and management of catastrophic failures Alfred Hübler
A computer has beaten a human professional for the first time at Go — an ancient board game that has long been viewed as one of the greatest challenges for artificial intelligence (AI). The best human players of chess, draughts and backgammon have all been outplayed by computers. But a hefty handicap was needed for computers to win at Go. Now Google’s London-based AI company, DeepMind, claims that its machine has mastered the game.
Mastering the game of Go with deep neural networks and tree search David Silver, Aja Huang, Chris J. Maddison, Arthur Guez, Laurent Sifre, et al.
Insects, whether they creep or fly, live in a world of hard knocks. Who has not stepped on a cockroach, then raised her shoe to watch the creature get up and scoot under a door? Bees and wasps, for their part, face a never-ending obstacle course of leaves, stems, and petals—bumblebees crash their wings into obstacles as often as once a second. Now, researchers are learning how these creatures bend but don't break. The results do more than explain why cockroaches are so hard to kill. By mimicking the combination of rigid and flexible parts that gives insect exoskeletons and wings their resilience, biomechanicists are making robots tougher. It's quite a contrast to the way engineers have designed most of their machines, but it may lead to better robots for search and rescue.
Residential mobility is deeply entangled with all aspects of hunter-gatherer lifeways, and is therefore an issue of central importance in hunter-gatherer studies. Hunter-gatherers vary widely in annual rates of residential mobility, and understanding the sources of this variation has long been of interest to anthropologists and archaeologists. Since mobility is, to a large extent, driven by the need for a continuous supply of food, a natural framework for addressing this question is provided by the metabolic theory of ecology, which offers a powerful basis for formulating formal, testable hypotheses concerning evolutionary and ecological constraints on the scale and variation of hunter-gatherer residential mobility. We evaluate these predictions using extant data and show strong support for the hypotheses. We show that the overall scale of hunter-gatherer residential mobility is predicted by average human body size and the limited capacity of mobile hunter-gatherers to store energy internally. We then show that the majority of variation in residential mobility observed across cultures is predicted by energy availability in local ecosystems. Our results demonstrate that large-scale evolutionary and ecological processes, common to all plants and animals, constrain hunter-gatherers in predictable ways as they move through territories to effectively exploit resources over the course of a year. Moreover, our results extend the scope of the metabolic theory of ecology by showing how it successfully predicts variation in the behavioral ecology of populations within a species.
The ecological and evolutionary energetics of hunter-gatherer residential mobility Marcus J. Hamilton, Jose Lobo, Eric Rupley, Hyejin Youn, Geoffrey B. West
Next month, the worldwide semiconductor industry will formally acknowledge what has become increasingly obvious to everyone involved: Moore's law, the principle that has powered the information-technology revolution since the 1960s, is nearing its end. A rule of thumb that has come to dominate computing, Moore's law states that the number of transistors on a microprocessor chip will double every two years or so — which has generally meant that the chip's performance will, too. The exponential improvement that the law describes transformed the first crude home computers of the 1970s into the sophisticated machines of the 1980s and 1990s, and from there gave rise to high-speed Internet, smartphones and the wired-up cars, refrigerators and thermostats that are becoming prevalent today.
The chips are down for Moore’s law M. Mitchell Waldrop
Many car-following models have been developed for jam avoidance on highways. Two mechanisms are used to improve stability: feedback control in autonomous models, and increased interaction within cooperative ones. In this paper, we compare linear autonomous and collective optimal velocity (OV) models. We observe that stability is significantly increased by adding predecessors to the interaction in collective models. Yet the autonomous and collective approaches are close when the speed-difference term is taken into account. Among the linear OV models tested, the autonomous models including the speed difference are sufficient to maximise stability.
Jam avoidance with autonomous systems Antoine Tordeux, Sylvain Lassarre
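To see what "stability" means here, the sketch below simulates the classical optimal velocity model on a ring road. This is an illustrative textbook setup with assumed parameters, not the authors' exact configuration. Linear analysis of the OV model predicts that uniform flow is stable when the sensitivity a exceeds 2V'(h*); with the standard OV function and equilibrium headway h* = 2, that threshold is a = 2, so the run below at a = 3 should damp a small perturbation back to uniform flow:

```python
import math

# Classical OV model on a ring: each driver relaxes at rate a toward an
# optimal speed V(headway). Parameters are standard illustrative choices.
def simulate_ov(a=3.0, n_cars=10, road_length=20.0, t_end=100.0, dt=0.01):
    """Return the final spread of headways; near 0 means uniform flow survived."""
    V = lambda h: math.tanh(h - 2.0) + math.tanh(2.0)  # standard OV function
    h0 = road_length / n_cars                          # equilibrium headway
    x = [i * h0 for i in range(n_cars)]
    x[0] += 0.1                                        # small perturbation
    v = [V(h0)] * n_cars
    for _ in range(int(t_end / dt)):
        headways = [(x[(i + 1) % n_cars] - x[i]) % road_length
                    for i in range(n_cars)]
        v = [vi + a * (V(h) - vi) * dt for vi, h in zip(v, headways)]
        x = [xi + vi * dt for xi, vi in zip(x, v)]
    headways = [(x[(i + 1) % n_cars] - x[i]) % road_length
                for i in range(n_cars)]
    return max(headways) - min(headways)

spread = simulate_ov()
print(f"final headway spread: {spread:.4f}")  # small -> the jam was avoided
```

Lowering `a` below the threshold makes the same perturbation grow instead of decay, which is the jam-formation instability the paper's mechanisms are designed to suppress.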
Money is central in US politics, and most campaign contributions stem from a tiny, wealthy elite. Like other political acts, campaign donations are known to be socially contagious. We study how campaign donations diffuse through a network of more than 50 000 elites and examine how connectivity among previous donors reinforces contagion. We find the diffusion of donations to be driven by independent reinforcement contagion: people are more likely to donate when exposed to donors from different social groups than when they are exposed to equally many donors from the same group. Counter-intuitively, being exposed to one side may increase donations to the other side. Although the effect is weak, simultaneous cross-cutting exposure makes donation somewhat less likely. Finally, the independence of donors in the beginning of a campaign predicts the amount of money that is raised throughout a campaign. We theorize that people infer population-wide estimates from their local observations, with elites assessing the viability of candidates, possibly opposing candidates in response to local support. Our findings suggest that theories of complex contagions need refinement and that political campaigns should target multiple communities.
Complex Contagion of Campaign Donations V.A. Traag
Sharing economy platforms have become extremely popular in the last few years, and they have changed the way in which we commute, travel, and borrow, among many other activities. Despite their popularity among consumers, such companies are poorly regulated. For example, Airbnb, one of the most successful examples of a sharing economy platform, is often criticized by regulators and policy makers. While, in theory, municipalities should regulate the emergence of Airbnb through evidence-based policy making, in practice, they engage in a false dichotomy: some municipalities allow the business without imposing any regulation, while others ban it altogether. That is because there is no evidence upon which to draft policies. Here we propose to gather evidence from the Web. After crawling Airbnb data for the entire city of London, we find out where and when Airbnb listings are offered and, by matching such listing information with census and hotel data, we determine the socio-economic conditions of the areas that actually benefit from the hospitality platform. The reality is more nuanced than one would expect, and it has changed over the years. Airbnb demand and offerings have changed over time, and traditional regulations have not been able to respond to those changes. That is why, finally, we rely on our data analysis to envision regulations that are responsive to real-time demands, contributing to the emerging idea of "algorithmic regulation".
Who Benefits from the "Sharing" Economy of Airbnb? Giovanni Quattrone, Davide Proserpio, Daniele Quercia, Licia Capra, Mirco Musolesi
New types of robots inspired by biological principles of assembly, locomotion, and behavior have been recently described. In this work we explored the concept of robots that are based on more fundamental physical phenomena, such as fluid dynamics, and their potential capabilities. We report a robot made entirely of non-Newtonian fluid, driven by shear strains created by spatial patterns of audio waves. We demonstrate various robotic primitives such as locomotion and transport of metallic loads—up to 6-fold heavier than the robot itself—between points on a surface, splitting and merging, shapeshifting, percolation through gratings, and counting to 3. We also utilized interactions between multiple robots carrying chemical loads to drive a bulk chemical synthesis reaction. Free of constraints such as skin or obligatory structural integrity, fluid robots represent a radically different design that could adapt more easily to unfamiliar, hostile, or chaotic environments and carry out tasks that neither living organisms nor conventional machines are capable of.
Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
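In that spirit, here is a minimal worked example of the entropy side (an assumed example, not taken from the text): Shannon entropy quantifies uncertainty, and the information gained from an observation is the reduction in that uncertainty.

```python
import math

# Entropy measures uncertainty; information is the reduction of
# uncertainty achieved by an observation.
def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = entropy([0.5, 0.5])     # a fair coin: maximal uncertainty, 1 bit
biased = entropy([0.9, 0.1])   # a biased coin: partly predictable, ~0.47 bits
print(f"fair: {fair:.3f} bits, biased: {biased:.3f} bits")
# Observing the fair coin's outcome therefore conveys more information
# (1 bit) than observing the biased coin's (~0.47 bits), because the
# biased outcome was already largely predictable.
```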
Human impacts on the planet, including anthropogenic climate change, are reshaping ecosystems in unprecedented ways. To meet the challenge of conserving biodiversity in this rapidly changing world, we must understand how ecological assemblages respond to novel conditions (1). However, species in ecosystems are not fixed entities, even without human-induced change. All ecosystems experience natural turnover in species presence and abundance. Taking account of this baseline turnover in conservation planning could play an important role in protecting biodiversity.
In peer recommendation systems, social signals affect item popularity about half as much as position and content do, and further create a "herding" effect that biases people's judgments about the content.
Since its introduction in the 1960s, the theory of innovation diffusion has contributed to the advancement of several research fields, such as marketing management and consumer behavior. The 1969 seminal paper by Bass [F.M. Bass, Manag. Sci. 15, 215 (1969)] introduced a model of product growth for consumer durables, which has been extensively used to predict innovation diffusion across a range of applications. Here, we propose a novel approach to the study of innovation diffusion, in which interactions among individuals are mediated by the dynamics of a time-varying network. Our approach is based on the Bass model, and it overcomes key limitations of previous studies, which assumed a timescale separation between the individual dynamics and the evolution of the connectivity patterns. Thus, we do not hypothesize homogeneous mixing among individuals or the existence of a fixed interaction network. We formulate our approach in the framework of activity-driven networks to enable the analysis of the concurrent evolution of the interaction and individual dynamics. Numerical simulations offer a systematic analysis of the model's behavior and highlight the role of individual activity in market penetration when targeted advertisement campaigns are designed or when two different products compete.
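The classical Bass model that this work generalizes is compact enough to sketch directly. Below, a simple Euler integration of dF/dt = (p + qF)(1 - F), where F is the adopter fraction, p the innovation coefficient, and q the imitation coefficient; the parameter values are illustrative, not fitted:

```python
# Classical (well-mixed) Bass diffusion model; p and q are illustrative
# textbook-style values, not parameters estimated by the authors.
def bass_adoption(p=0.03, q=0.38, years=15, steps_per_year=100):
    """Euler-integrate dF/dt = (p + q*F) * (1 - F), F = fraction of adopters."""
    dt = 1.0 / steps_per_year
    F, trajectory = 0.0, []
    for step in range(years * steps_per_year):
        F += (p + q * F) * (1.0 - F) * dt
        if (step + 1) % steps_per_year == 0:   # record once per year
            trajectory.append(round(F, 3))
    return trajectory

curve = bass_adoption()
print(curve)  # slow start, rapid middle, saturation: the classic S-curve
```

The homogeneous-mixing assumption is exactly what the proposed activity-driven-network formulation relaxes: instead of every adopter influencing everyone, influence flows only over the links active at each moment.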