Does the availability of instant reference checking and “find more like this” research on the Internet change the standards by which academics should feel “obligated” to cite the work of others? Is the deliberate refusal to look for the existence of parallel work by others an ethical lapse or merely negligence? At a minimum, the Dutch standard of Slodderwetenschap (sloppy science) is clearly at work. At a maximum, so is plagiarism. In between sits what might be labeled ‘plagiarism by negligence’. This article seeks to expose the intellectual folly of allowing such plagiarism to be tolerated by the academy through a discussion of the cases of Terrence Deacon and Stephen Wolfram.
Subliminal Influence or Plagiarism by Negligence? The Slodderwetenschap of Ignoring the Internet
The new science of complex systems will be at the heart of the future of the Worldwide Knowledge Society. It is providing radical new ways of understanding the physical, biological, ecological, and techno-social universe. Complex Systems are open, value-laden, multi-level, multi-component, reconfigurable systems of systems, situated in turbulent, unstable, and changing environments. They evolve, adapt and transform through internal and external dynamic interactions. They are the source of very difficult scientific challenges for observing, understanding, reconstructing and predicting their multi-scale dynamics. The challenges posed by the multi-scale modelling of both natural and artificial adaptive complex systems can only be met with radically new collective strategies for research and teaching (...)
The hallmark of deterministic chaos is that it creates information—the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system’s intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: a portion of the created information—the ephemeral information—is forgotten, and a portion—the bound information—is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute.
Chaos Forgets and Remembers: Measuring Information Creation, Destruction, and Storage Ryan G. James, Korana Burke, James P. Crutchfield
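The block-entropy estimator underlying such analyses can be sketched in a few lines of Python. This is the generic symbolic-dynamics estimate of the metric entropy rate (the growth of Shannon block entropy), not the authors' full ephemeral/bound decomposition:

```python
import math
import random
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of the length-L blocks occurring in seq."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())

def entropy_rate(seq, L):
    """Estimate the metric entropy h_mu as the block-entropy growth H(L) - H(L-1)."""
    return block_entropy(seq, L) - block_entropy(seq, L - 1)

# A periodic orbit creates no information (h_mu -> 0)...
periodic = [0, 1] * 500
# ...while a fair coin creates one bit per symbol (h_mu -> 1).
random.seed(0)
coin = [random.randint(0, 1) for _ in range(100000)]
```

The two test sequences bracket the extremes the abstract discusses: zero information creation for the periodic signal, maximal creation for the coin flips.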
Networks in the real world do not exist as isolated entities, but they are often part of more complicated structures composed of many interconnected network layers. Recent studies have shown that such mutual dependence leaves real networked systems exposed to potentially catastrophic failures, and thus there is an urgent necessity to better understand the mechanisms at the basis of this fragility. The theoretical approach to this problem is based on the study of the nature of the phase transitions associated with critical phenomena running on interconnected networks. In particular, it has been shown that many critical phenomena of continuous nature in isolated networks become instead discontinuous, and thus catastrophic, in multi-layer networks when the strength of the interconnections is sufficiently large. In this paper, we show that four main ingredients determine the critical features of a random interconnected network: the strength of the interconnections, the first two moments of the degree distribution of the entire network, and the correlation between intra- and inter-layer degrees. Different mixtures of these ingredients change the location of the critical points, and lead to the emergence of a very rich scenario where phase transitions can be either discontinuous or continuous and different regimes can disappear or even coexist.
Coexistence of critical regimes in interconnected networks Filippo Radicchi
That heat always flows from a hot place to a cold one is a manifestation of the second law of thermodynamics. It is such a part of our everyday experience that to propose the opposite would seem ridiculous. But, that was, in principle, what the Scottish physicist James Clerk Maxwell—one of the giants of physics—suggested in 1867 with his idea of Maxwell’s demon. Maxwell’s thesis was that the second law of thermodynamics has only a statistical certainty. In other words, with sufficiently detailed information about the microscopic motions of individual atoms and molecules, we might be able to separate the fast-moving (“hot”) ones from the slow-moving (“cold”) ones and induce the heat to flow from cold to hot, in apparent contradiction with the second law of thermodynamics. This apparent conundrum, which has caught the attention of many well-known physicists from Lord Kelvin onward, is typically interpreted as the result of an external observer tinkering with the actual microscopic dynamics of a thermodynamic system—the “demon” still alien to many. In this paper, we take a different approach. To the traditional thermodynamic analyses that employ heat and work reservoirs, we add an information reservoir in place of an external observer. The information reservoir captures the physical consequences of Maxwell’s demon, or more precisely, its memory. We then use a single time-independent Hamiltonian to describe a collection of thermodynamic elements, including a device, one or more heat baths, a work reservoir, and an information reservoir (like, for instance, a stream of bits). As a main result, we generalize several classic formulations of the second law of thermodynamics, such as the Kelvin-Planck, Clausius, and Carnot statements, to situations involving information processing. Our paper provides a general conceptual framework for further development and explorations in the cross-disciplinary field of thermodynamic theories of information processing.
Information Processing and the Second Law of Thermodynamics: An Inclusive, Hamiltonian Approach Sebastian Deffner and Christopher Jarzynski
The massive amounts of geolocation data collected from mobile phone records have sparked an ongoing effort to understand and predict the mobility patterns of human beings. In this work, we study the extent to which social phenomena are reflected in mobile phone data, focusing in particular on the cases of urban commutes and major sports events. We illustrate how these events are reflected in the data, and show how information about the events can be used to improve predictability in a simple model for a mobile phone user's location.
Human Mobility and Predictability enriched by Social Phenomena Information Nicolas Ponieman, Alejo Salles, Carlos Sarraute
We compared entropy for texts written in natural languages (English, Spanish) and artificial languages (computer software) based on a simple expression for the entropy as a function of message length and specific word diversity. Code text written in artificial languages showed higher entropy than text of similar length expressed in natural languages. Spanish texts exhibit more symbolic diversity than English ones. Results showed that algorithms based on complexity measures differentiate artificial from natural languages, and that text analysis based on complexity measures allows the unveiling of important aspects of their nature. We propose specific expressions to examine entropy-related aspects of texts and estimate the values of entropy, emergence, self-organization and complexity based on specific diversity and message length.
Complexity measurement of natural and artificial languages Gerardo Febres, Klaus Jaffe, Carlos Gershenson
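The two ingredients the abstract names, entropy as a function of message length and specific word diversity, can be illustrated with a minimal sketch. This is a plain word-level Shannon entropy and a distinct/total word ratio, not the authors' specific expressions:

```python
import math
from collections import Counter

def word_entropy(text):
    """Shannon entropy (bits per word) of the text's word-frequency distribution."""
    words = text.lower().split()
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in Counter(words).values())

def diversity(text):
    """Specific word diversity: distinct words divided by total words."""
    words = text.lower().split()
    return len(set(words)) / len(words)

# A fully repetitive text carries zero bits per word; a maximally diverse
# text of n words carries log2(n) bits per word.
```

On this scale, highly repetitive code tokens and diverse natural-language vocabularies land at opposite ends, which is the kind of separation the paper exploits.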
Communication depends on reliability. Yet, the existence of stable honest signalling presents an evolutionary puzzle. Why should animals signal honestly in the face of a conflict of interest? This study [...] provides, to our knowledge, the first experimental evidence showing honesty persists when costs are high and disappears when costs are low.
The remarkable ecological and demographic success of humanity is largely attributed to our capacity for cumulative culture [1-3]. The accumulation of beneficial cultural innovations across generations is puzzling because transmission events are generally imperfect, although there is large variance in fidelity. Events of perfect cultural transmission and innovations should be more frequent in a large population [4]. As a consequence, a large population size may be a prerequisite for the evolution of cultural complexity [4, 5], although anthropological studies have produced mixed results [6-9] and empirical evidence is lacking [10]. Here we use a dual-task computer game to show that cultural evolution strongly depends on population size, as players in larger groups maintained higher cultural complexity. We found that when group size increases, cultural knowledge is less deteriorated, improvements to existing cultural traits are more frequent, and cultural trait diversity is maintained more often. Our results demonstrate how changes in group size can generate both adaptive cultural evolution and maladaptive losses of culturally acquired skills. As humans live in habitats for which they are ill-suited without specific cultural adaptations [11, 12], it suggests that, in our evolutionary past, group-size reduction may have exposed human societies to significant risks, including societal collapse [13].
Experimental evidence for the influence of group size on cultural complexity Maxime Derex, Marie-Pauline Beugin, Bernard Godelle & Michel Raymond
A key aim in biology and psychology is to identify fundamental principles underpinning the behavior of animals, including humans. Analyses of human language and the behavior of a range of non-human animal species have provided evidence for a common pattern underlying diverse behavioral phenomena: Words follow Zipf's law of brevity (the tendency of more frequently used words to be shorter), and conformity to this general pattern has been seen in the behavior of a number of other animals. It has been argued that the presence of this law is a sign of efficient coding in the information theoretic sense. However, no strong direct connection has been demonstrated between the law and compression, the information theoretic principle of minimizing the expected length of a code. Here, we show that minimizing the expected code length implies that the length of a word cannot increase as its frequency increases. Furthermore, we show that the mean code length or duration is significantly small in human language, and also in the behavior of other species in all cases where agreement with the law of brevity has been found. We argue that compression is a general principle of animal behavior that reflects selection for efficiency of coding.
Compression as a Universal Principle of Animal Behavior
Ramon Ferrer-i-Cancho, Antoni Hernández-Fernández, David Lusseau, Govindasamy Agoramoorthy, Minna J. Hsu and Stuart Semple
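The link the paper draws between brevity and compression can be made concrete with a Huffman code, since in any optimal prefix code the code length is non-increasing in symbol frequency. A minimal sketch (the word list and counts below are invented for illustration):

```python
import heapq

def huffman_lengths(freqs):
    """Code length (bits) per symbol under an optimal Huffman prefix code."""
    # Heap entries carry (total frequency, tiebreak id, {symbol: depth}).
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# Under any optimal code, more frequent symbols never get longer codes --
# the law of brevity in its information-theoretic form.
freqs = {"the": 50, "of": 30, "entropy": 10, "slodderwetenschap": 1}
lengths = huffman_lengths(freqs)
```

The monotone frequency-length relationship visible in `lengths` is exactly the property the authors show must hold when expected code length is minimized.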
The temporal statistics exhibited by written correspondence appear to be media dependent, with features which have so far proven difficult to characterize. We explain the origin of these difficulties by disentangling the role of spontaneous activity from decision-based prioritizing processes in human dynamics, clocking all waiting times through each agent's 'proper time' measured by activity. This unveils the same fundamental patterns in written communication across all media (letters, email, SMS), with response times displaying truncated power-law behavior and average exponents near -3/2. When standard time is used, the response time probabilities are theoretically predicted to exhibit a bi-modal character, which is empirically borne out by our new years-long data on email. These novel perspectives on the temporal dynamics of human correspondence should aid in the analysis of interaction phenomena in general, including resource management, optimal pricing and routing, information sharing, emergency handling.
Hidden scaling patterns and universality in written communication M. Formentin, A. Lovison, A. Maritan, G. Zanzotto
The dynamics of economies and infectious disease are inexorably linked: economic well-being influences health (sanitation, nutrition, treatment capacity, etc.) and health influences economic well-being (labor productivity lost to sickness and disease). Often societies are locked into "poverty traps" of poor health and poor economy. Here, using a simplified coupled disease-economic model with endogenous capital growth we demonstrate the formation of poverty traps, as well as ways to escape them. We suggest two possible mechanisms of escape both motivated by empirical data: one, through an influx of capital (development aid), and another through changing the percentage of GDP spent on healthcare. We find that a large influx of capital is successful in escaping the poverty trap, but increasing health spending alone is not. Our results demonstrate that escape from a poverty trap may be possible, and carry important policy implications in the world-wide distribution of aid and within-country healthcare spending.
Escaping the poverty trap: modeling the interplay between economic growth and the ecology of infectious disease Georg M. Goerg, Oscar Patterson-Lomba, Laurent Hébert-Dufresne, Benjamin M. Althouse
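The bistability behind a poverty trap can be reproduced with a toy coupled model: an SIS epidemic whose recovery rate improves with capital, and a capital stock whose growth is cut by disease prevalence. All parameters below are illustrative assumptions, not the authors' calibration:

```python
def simulate(I0, K0, steps=20000, dt=0.01):
    """Euler integration of a toy disease-economy model (illustrative
    parameters). I: infected fraction (SIS); K: per-capita capital.
    Returns (I, K) at the end of the run."""
    beta, delta, s = 3.0, 0.1, 0.25   # transmission, depreciation, savings rate
    I, K = I0, K0
    for _ in range(steps):
        recovery = 0.5 + 1.5 * K**4 / (K**4 + 1)  # healthcare improves with K
        output = 1.0 - I                          # sickness cuts production
        dI = beta * I * (1 - I) - recovery * I
        dK = s * output - delta * K
        I = min(max(I + dt * dI, 0.0), 1.0)
        K = max(K + dt * dK, 0.0)
    return I, K

# Same disease, two initial capital stocks: the poor economy settles into a
# high-prevalence trap, while a capital influx tips it into the healthy state.
I_trap, K_trap = simulate(I0=0.5, K0=0.2)
I_rich, K_rich = simulate(I0=0.5, K0=2.0)
```

The steep (sigmoidal) dependence of recovery on capital is what creates two stable equilibria, mirroring the paper's finding that a large one-off capital influx can escape the trap.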
We compare the likelihood of different socially relevant features to allow the evolutionary emergence and maintenance of cooperation in a generalized variant of the iterated Prisoner's Dilemma game. Results show that the average cost/benefit balance of cooperation is the primary constraint for its establishment and maintenance. Behavior that increases inclusive fitness, such as assortation, homophily, kin selection and tagging of individuals, is second in importance. Network characteristics were the least important in favoring the establishment and maintenance of cooperation, despite being the most popular in recent research on the subject. Results suggest that inclusive fitness theory with its expansions to include assortative and economic considerations is more general, powerful and relevant in analyzing social phenomena than kin selection theory with its emphasis on genetic relatedness. Merging economics with evolutionary theory will be necessary to reveal more about the nature of social dynamics.
Relative importance of social synergy, assortation and networks in the evolution of social cooperation Claudia Montoreano, Klaus Jaffe
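The primacy of the cost/benefit balance, modulated by assortment, has a textbook expression that can be sketched directly. In replicator dynamics for the donation game with assortment r, the payoff gap between cooperators and defectors reduces to r*b - c (Hamilton's rule); this is a standard simplification, not the authors' generalized game:

```python
def final_cooperator_share(b, c, r, x0=0.5, steps=2000, dt=0.01):
    """Replicator dynamics for the donation game with assortment r: a
    cooperator's partner is a cooperator with probability r + (1 - r) * x.
    The payoff gap W_C - W_D then equals r*b - c, independent of x."""
    x = x0
    for _ in range(steps):
        x += dt * x * (1 - x) * (r * b - c)
        x = min(max(x, 0.0), 1.0)
    return x

# With benefit 3 and cost 1, cooperation fixes when assortment exceeds
# c/b = 1/3 and vanishes below it.
```

The knife-edge at r = c/b illustrates why the cost/benefit ratio, rather than network detail, sets the primary constraint.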
The financial crisis clearly illustrated the importance of characterizing the level of ‘systemic’ risk associated with an entire credit network, rather than with single institutions. However, the interplay between financial distress and topological changes is still poorly understood. Here we analyze the quarterly interbank exposures among Dutch banks over the period 1998–2008, ending with the crisis. After controlling for the link density, many topological properties display an abrupt change in 2008, providing a clear – but unpredictable – signature of the crisis. By contrast, if the heterogeneity of banks' connectivity is controlled for, the same properties show a gradual transition to the crisis, starting in 2005 and preceded by an even earlier period during which anomalous debt loops could have led to the underestimation of counter-party risk. These early-warning signals are undetectable if the network is reconstructed from partial bank-specific data, as routinely done. We discuss important implications for bank regulatory policies.
Hod Lipson, an engineer who runs the Creative Machines Lab at Cornell University in Ithaca, New York, draws an analogy with the history of the computer. In the 1950s, computers were rare, expensive and owned mostly by large universities and businesses, and they required expert users to perform even relatively simple tasks. By the 1970s and 1980s, personal computers had emerged, and enthusiasts were assembling them from kits and writing their own software. Now practically everyone carries a powerful computer in their pocket and can do all manner of tasks with no programming knowledge. In the case of 3D printing, Lipson says, the transition from rare, limited and cumbersome to common, versatile and easy-to-use is happening quickly. “I used to say we're in the 1975 of printers, and now we're in the mid-80s already,” he says. “We're still at the point where most people are not comfortable using 3D printers and design tools. Those who are can make things a lot easier for themselves and get an edge.”
We propose a model that explains the reliable emergence of power laws (e.g., Zipf’s law) during the development of different human languages. The model incorporates the principle of least effort in communications, minimizing a combination of the information-theoretic communication inefficiency and direct signal cost. We prove a general relationship, for all optimal languages, between the signal cost distribution and the resulting distribution of signals. Zipf’s law then emerges for logarithmic signal cost distributions, which is the cost distribution expected for words constructed from letters or phonemes.
Zipf’s Law: Balancing Signal Usage Cost and Communication Efficiency Christoph Salge, Nihat Ay, Daniel Polani, Mikhail Prokopenko
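The mechanism can be sketched numerically under one simplifying assumption: trading off coding inefficiency against expected signal cost yields a Gibbs-form optimum p_i proportional to exp(-lam * cost_i), where lam is a Lagrange multiplier introduced here for illustration. Logarithmic costs then give a power law:

```python
import math

def optimal_signal_distribution(costs, lam=1.0):
    """Gibbs-form optimum p_i proportional to exp(-lam * cost_i), the standard
    solution when minimizing inefficiency plus expected signal cost."""
    weights = [math.exp(-lam * c) for c in costs]
    total = sum(weights)
    return [w / total for w in weights]

# Logarithmic signal costs (words built from letters): cost_i = log(i).
n = 1000
p = optimal_signal_distribution([math.log(i) for i in range(1, n + 1)])
# Zipf's law emerges: p_i is proportional to 1 / i.
```

With lam = 1 the ratio p_1 / p_k equals k exactly, the pure Zipf exponent; other lam values give other power-law slopes.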
We investigate the failure mechanisms of load sharing complex systems. The system is composed of multiple nodes or components whose failures are determined based on the interaction of their respective strengths and loads (or capacity and demand respectively) as well as the ability of a component to share its load with its neighbors when needed. We focus on two distinct mechanisms to model the interaction between components' strengths and loads. The failure mechanisms of these two models demonstrate temporal scaling phenomena, phase transitions and multiple distinct failure modes excited by extremal dynamics. For critical ranges of parameters the models demonstrate power law and exponential failure patterns. We identify the similarities and differences between the two mechanisms and the implications of our results to the failure mechanisms of complex systems in the real world.
Failure mechanisms of load sharing complex systems Shahnewaz Siddique, Vitali Volovoi
Since the year 2000, psychological research has tied gratitude to a host of benefits: the tendency to feel more hopeful and optimistic about one’s own future, better coping mechanisms for dealing with adversity and stress, fewer instances of depression and addiction, exercising more, and even sleeping better. The degree to which we’re grateful “can explain more variance in life satisfaction than such traits as love, forgiveness, social intelligence, and humor,” sings one recent paper. “Gratitude is strongly related to all aspects of well-being,” declares another.
We study a simple voter model with two competing parties. In particular, we represent the case of political elections, where people can choose to support one of the two competitors or to remain neutral. People interact in a social network and their opinion depends on those of their neighbors. Therefore, people may change opinion over time, i.e., they can support one competitor or none. The two competitors try to gain the people's consensus by interacting with their neighbors and also with other people. In particular, competitors define temporal connections, following a strategy, to interact with people they do not know, i.e., with all the people that are not their neighbors. We analyze the proposed model to investigate which network strategies are more advantageous, for the competitors, in order to gain the popular consensus. As a result, we find that the best network strategy depends on the topology of the social network. Finally, we investigate how the charisma of competitors affects the outcomes of the proposed model.
Network Strategies in the Voter Model Marco Alberto Javarone
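The baseline dynamics underneath such models is the classic two-state voter model, which already shows how local imitation drives a network to consensus. A stripped-down sketch on a ring, without the paper's neutral state or competitor strategies:

```python
import random

def voter_model(n=20, seed=1, max_steps=500000):
    """Two-party voter model on a ring: each step a random agent copies a
    random neighbor, until everyone agrees. Returns (steps, final opinions)."""
    random.seed(seed)
    opinions = [random.randint(0, 1) for _ in range(n)]
    for step in range(max_steps):
        if len(set(opinions)) == 1:
            return step, opinions
        i = random.randrange(n)
        j = (i + random.choice((-1, 1))) % n
        opinions[i] = opinions[j]
    return max_steps, opinions

steps, final = voter_model()
```

The competitors in the paper effectively bias this copying process by adding temporal links; the pure model above is the neutral reference point against which their strategies are compared.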
We argue for a decentralized approach where the participants in the social network keep their own data and perform computations in a distributed fashion without any central authority. A natural question that arises then is what distributed computations can be performed in such a decentralized setting. Our primary contribution is to lay the ground for expressing the question precisely. We refer to the underlying problem as the S3 problem: Scalable Secure computing in a Social network. Whereas scalability characterizes the spatial, computational and message complexity of the computation, the secure aspect of S3 encompasses accuracy and privacy.(...)
Computing in Social Networks
Andrei Giurgiu, Rachid Guerraoui, Kévin Huguenin, Anne-Marie Kermarrec
When making decisions, humans can observe many kinds of information about others' activities, but their effects on performance are not well understood. We investigated social learning strategies using a simple problem-solving task in which participants search a complex space, and each can view and imitate others' solutions. Results showed that participants combined multiple sources of information to guide learning, including payoffs of peers' solutions, popularity of solution elements among peers, similarity of peers' solutions to their own, and relative payoffs from individual exploration. Furthermore, performance was positively associated with imitation rates at both the individual and group levels. When peers' payoffs were hidden, popularity and similarity biases reversed, participants searched more broadly and randomly, and both quality and equity of exploration suffered. We conclude that when peers' solutions can be effectively compared, imitation does not simply permit scrounging, but it can also facilitate propagation of good solutions for further cumulative exploration.
Social Learning Strategies in Networked Groups
Thomas N. Wisdom, Xianfeng Song and Robert L. Goldstone
Network robustness research aims at finding a measure to quantify network robustness. Once such a measure has been established, we will be able to compare networks, to improve existing networks and to design new networks that are able to continue to perform well when they are subject to failures or attacks. In this paper we survey a large number of robustness measures on simple, undirected and unweighted graphs, in order to offer a tool for network administrators to evaluate and improve the robustness of their network. The measures discussed in this paper are based on the concepts of connectivity (including reliability polynomials), distance, betweenness and clustering. Some other measures are notions from spectral graph theory, more precisely, they are functions of the Laplacian eigenvalues. In addition to surveying these graph measures, the paper also contains a discussion of their functionality as a measure for topological network robustness.
Graph measures and network robustness W. Ellens, R.E. Kooij
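One of the simplest connectivity-based probes in this family is the size of the largest component left after removing nodes, which already separates fragile hub-dominated topologies from homogeneous ones. A self-contained sketch (the star and ring graphs are illustrative examples, not from the survey):

```python
from collections import deque

def largest_component(adj, removed=frozenset()):
    """Size of the largest connected component of an adjacency-dict graph,
    ignoring the nodes in `removed`. Plain breadth-first search."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        size, queue = 0, deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

# Star vs ring on 6 nodes: deleting the highest-degree node shatters the
# star completely but barely dents the ring.
star = {0: [1, 2, 3, 4, 5], 1: [0], 2: [0], 3: [0], 4: [0], 5: [0]}
ring = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
```

Measures like this complement the spectral quantities in the survey: both try to summarize, in one number, how much connectivity survives failures or targeted attacks.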
Collective, especially group-based, managerial decision making is crucial in organizations. Using an evolutionary theory approach to collective decision making, agent-based simulations were conducted to investigate how collective decision making would be affected by the agents' diversity in problem understanding and/or behavior in discussion, as well as by their social network structure. Simulation results indicated that groups with consistent problem understanding tended to produce higher utility values of ideas and displayed better decision convergence, but only if there was no group-level bias in collective problem understanding. Simulation results also indicated the importance of balance between selection-oriented (i.e., exploitative) and variation-oriented (i.e., explorative) behaviors in discussion to achieve quality final decisions. Expanding the group size and introducing non-trivial social network structure generally improved the quality of ideas at the cost of decision convergence. Simulations with different social network topologies revealed that collective decision making on small-world networks with high local clustering tended to achieve highest decision quality more often than on random or scale-free networks. Implications of this evolutionary theory and simulation approach for future managerial research on collective, group, and multi-level decision making are discussed.
Evolutionary perspectives on collective decision making: Studying the implications of diversity and social network structure with agent-based simulations Hiroki Sayama, Shelley D. Dionne, Francis J. Yammarino
Social influence is the process by which individuals adapt their opinion, revise their beliefs, or change their behavior as a result of social interactions with other people. In our strongly interconnected society, social influence plays a prominent role in many self-organized phenomena such as herding in cultural markets, the spread of ideas and innovations, and the amplification of fears during epidemics. Yet, the mechanisms of opinion formation remain poorly understood, and existing physics-based models lack systematic empirical validation. Here, we report two controlled experiments showing how participants answering factual questions revise their initial judgments after being exposed to the opinion and confidence level of others. Based on the observation of 59 experimental subjects exposed to peer-opinion for 15 different items, we draw an influence map that describes the strength of peer influence during interactions. A simple process model derived from our observations demonstrates how opinions in a group of interacting people can converge or split over repeated interactions. In particular, we identify two major attractors of opinion: (i) the expert effect, induced by the presence of a highly confident individual in the group, and (ii) the majority effect, caused by the presence of a critical mass of laypeople sharing similar opinions. Additional simulations reveal the existence of a tipping point at which one attractor will dominate over the other, driving collective opinion in a given direction. These findings have implications for understanding the mechanisms of public opinion formation and managing conflicting situations in which self-confident and better informed minorities challenge the views of a large uninformed majority.
Social Influence and the Collective Dynamics of Opinion Formation Mehdi Moussaid, Juliane E. Kaemmer, Pantelis P. Analytis, Hansjoerg Neth
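The expert effect the authors identify can be reproduced with a toy confidence-weighted pooling rule: every agent moves toward the confidence-weighted group mean, and more confident agents move less. This is a simplified stand-in for the fitted influence map, with invented confidence values:

```python
def revise(opinions, confidences, rounds=60):
    """Repeated opinion revision: each round, every agent shifts toward the
    confidence-weighted mean by a step of (1 - own confidence).
    Confidences lie in [0, 1]; higher confidence means a smaller shift."""
    ops = list(opinions)
    total_c = sum(confidences)
    for _ in range(rounds):
        mean = sum(o * c for o, c in zip(ops, confidences)) / total_c
        ops = [o + (1 - c) * (mean - o) for o, c in zip(ops, confidences)]
    return ops

# One highly confident "expert" against five uncertain laypeople: the group
# converges, and the consensus lands near the expert's judgment.
final = revise([10.0, 0.0, 0.0, 0.0, 0.0, 0.0],
               [0.95, 0.2, 0.2, 0.2, 0.2, 0.2])
```

Raising the number of confident laypeople instead tips the same rule toward the majority effect, the second attractor described in the abstract.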