Who are we, and how do we relate to each other? Luciano Floridi, one of the leading figures in contemporary philosophy, argues that the explosive developments in Information and Communication Technologies (ICTs) are changing the answers to these fundamental human questions.
As the boundaries between life online and offline break down, and we become seamlessly connected to each other and surrounded by smart, responsive objects, we are all becoming integrated into an "infosphere". The personas we adopt on social media, for example, feed into our 'real' lives, so that we begin to live, as Floridi puts it, "onlife". Following the revolutions associated with Copernicus, Darwin, and Freud, this metaphysical shift represents nothing less than a fourth revolution.
"Onlife" defines more and more of our daily activity - the way we shop, work, learn, care for our health, entertain ourselves, conduct our relationships; the way we interact with the worlds of law, finance, and politics; even the way we conduct war. In every department of life, ICTs have become environmental forces which are creating and transforming our realities. How can we ensure that we shall reap their benefits? What are the implicit risks? Are our technologies going to enable and empower us, or constrain us? Floridi argues that we must expand our ecological and ethical approach to cover both natural and man-made realities, putting the 'e' in an environmentalism that can deal successfully with the new challenges posed by our digital technologies and information society.
The book focuses on Social Collective Intelligence, a term used to denote a class of socio-technical systems that combine, in a coordinated way, the strengths of humans, machines and collectives (their competences, knowledge and problem-solving capabilities) with the communication, computing and storage capabilities of advanced ICT. Social Collective Intelligence opens a number of challenges for researchers in both computer science and the social sciences; at the same time, it provides an innovative approach to challenges in diverse application domains, ranging from health to education and the organization of work. The book provides a cohesive and holistic treatment of Social Collective Intelligence, including challenges emerging in various disciplines (computer science, sociology, ethics) and opportunities for innovation in various application areas. Readers will gain insight and knowledge into the challenges and opportunities offered by this new, exciting field of investigation. Scientists will benefit from a comprehensive treatment of the open research challenges from a multidisciplinary perspective; practitioners and applied researchers will gain access to novel approaches for tackling relevant problems in their fields; and policy-makers and representatives of public bodies will come to understand how technological advances can help them support the progress of society and the economy.
We argue that a critical difference distinguishing machines from organisms and computers from brains is not complexity in a structural sense, but a difference in dynamical organization that is not well accounted for by current complexity measures. We propose a measure of the complexity of a system that is largely orthogonal to computational, information theoretic, or thermodynamic conceptions of structural complexity. What we call a system’s dynamical depth is a separate dimension of system complexity that measures the degree to which it exhibits discrete levels of nonlinear dynamical organization in which successive levels are distinguished by local entropy reduction and constraint generation. A system with greater dynamical depth than another consists of a greater number of such nested dynamical levels. Thus, a mechanical or linear thermodynamic system has less dynamical depth than an inorganic self-organized system, which has less dynamical depth than a living system. Including an assessment of dynamical depth can provide a more precise and systematic account of the fundamental difference between inorganic systems (low dynamical depth) and living systems (high dynamical depth), irrespective of the number of their parts and the causal relations between them.
Complexity and Dynamical Depth Terrence Deacon and Spyridon Koutroufinis
The emerging field of computational social science (CSS) is devoted to the pursuit of interdisciplinary social science research from an information processing perspective, through the medium of advanced computing and information technologies.
This reader-friendly textbook/reference is the first work of its kind to provide a comprehensive and unified Introduction to Computational Social Science. Four distinct methodological approaches are examined in particular detail, namely automated social information extraction, social network analysis, social complexity theory, and social simulation modeling. The coverage of each of these approaches is supported by a discussion of the historical context and motivations, as well as by a list of recommended texts for further reading.
The International System is a self-organized system and shows emergent behavior. During the timeframe 1495-1945, a finite-time singularity and four accompanying accelerating log-periodic cycles shaped the dynamics of the International System. The accelerated growth of the connectivity of the regulatory network of the International System, in combination with its anarchistic structure, produces and shapes the war dynamics of the system. The accelerated growth of the connectivity of the International System is fed by population growth and by the need of social systems to fulfill basic requirements. The finite-time singularity and the accompanying log-periodic oscillations were instrumental in the periodic reorganization of the regulatory network of the International System, and contributed to a long-term process of social expansion and integration in Europe. The singularity dynamic produced a series of organizational innovations. At the critical time of the singularity (1939), the connectivity of the system reached a critical threshold, resulting in a critical transition. This critical transition caused a fundamental reorganization of the International System: Europe transformed from an anarchistic system into a cooperative security community. This critical transition also marks the actual globalization of the International System. During the life span of the cycles, the war dynamics show chaotic characteristics. Various early-warning signals can be identified and can probably be used in the current International System. These findings have implications for the social sciences and for historical research.
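Finite-time singularities accompanied by log-periodic oscillations are commonly parametrized in the complex-systems literature by the log-periodic power law; a sketch of that standard form (not necessarily the author's exact parametrization) is:

```latex
% Log-periodic power law approaching a finite-time singularity at t_c
x(t) \simeq A + B\,(t_c - t)^{\beta}
      \left[\, 1 + C \cos\!\bigl(\omega \ln(t_c - t) + \phi\bigr) \right]
```

Here $t_c$ is the critical time (1939 in the abstract above), $\beta$ the power-law exponent, and $\omega$ the log-periodic angular frequency that sets the accelerating cycle structure.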
Network science is a rapidly emerging field of study that encompasses mathematics, computer science, physics, and engineering. A key issue in the study of complex networks is to understand the collective behavior of the various elements of these networks.
Although the results from graph theory have proven to be powerful in investigating the structures of complex networks, few books focus on the algorithmic aspects of complex network analysis. Filling this need, Complex Networks: An Algorithmic Perspective supplies the basic theoretical algorithmic and graph theoretic knowledge needed by every researcher and student of complex networks.
This book is about specifying, classifying, designing, and implementing mostly sequential and also parallel and distributed algorithms that can be used to analyze the static properties of complex networks. Providing a focused scope which consists of graph theory and algorithms for complex networks, the book identifies and describes a repertoire of algorithms that may be useful for any complex network.
- Provides the basic background in terms of graph theory
- Supplies a survey of the key algorithms for the analysis of complex networks
- Presents case studies of complex networks that illustrate the implementation of algorithms in real-world networks, including protein interaction networks, social networks, and computer networks
Requiring only a basic discrete mathematics and algorithms background, the book supplies guidance that is accessible to beginning researchers and students with little background in complex networks. To help beginners in the field, most of the algorithms are provided in ready-to-be-executed form.
Social systems have recently attracted much attention, with attempts to understand social behavior with the aid of statistical mechanics applied to complex systems. Collective properties of such systems emerge from couplings between components, for example, individual persons, transportation nodes such as airports or subway stations, and administrative districts. Among various collective properties, criticality is known as a characteristic property of a complex system, which helps such systems respond flexibly to external perturbations. This work considers the criticality of the urban transportation system as revealed by the massive smart card data on the Seoul transportation network. Analyzing the passenger flow on the Seoul bus system during one week, we find explicit power-law correlations in the system, that is, power-law behavior of the strength correlation function of bus stops, and verify scale invariance of the strength fluctuations. Such criticality is probed by means of the scaling and renormalization analysis of the modified gravity model applied to the system. Here a group of nearby (bare) bus stops is transformed into a (renormalized) "block stop", and the scaling relations of the network density turn out to be closely related to the fractal dimensions of the system, revealing the underlying structure. Specifically, the resulting renormalized values of the gravity exponent and of the Hill coefficient give a good description of the Seoul bus system: the former measures the characteristic dimensionality of the network, whereas the latter reflects the coupling between distinct transportation modes. It is thus demonstrated that such ideas of physics as scaling and renormalization can be applied successfully to social phenomena exemplified by the passenger flow.
Emergence of Criticality in the Transportation Passenger Flow: Scaling and Renormalization in the Seoul Bus System
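The gravity-model analysis above can be illustrated with a minimal sketch of the standard gravity form, in which the predicted flow between stops i and j scales as s_i s_j / d_ij^gamma. The stop strengths, positions, and exponent below are illustrative assumptions; the paper's modified gravity model, including the Hill-coefficient coupling between transportation modes, differs in detail.

```python
import numpy as np

# Hypothetical stop strengths (total passenger throughput) and 2-D positions;
# illustrative numbers, not the Seoul smart-card data.
strength = np.array([120.0, 80.0, 200.0, 50.0])
pos = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])

def gravity_flows(strength, pos, gamma=2.0):
    """Predicted flow T_ij ~ s_i * s_j / d_ij**gamma (standard gravity form)."""
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    with np.errstate(divide="ignore"):
        T = np.outer(strength, strength) / d**gamma
    np.fill_diagonal(T, 0.0)  # no self-flow
    return T

T = gravity_flows(strength, pos)
print(T.round(1))
```

Renormalization in the paper's sense would then group nearby stops into "block stops" (summing their strengths) and re-fit gamma at the coarser scale.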
Questions of values, ontologies, ethics, aesthetics, discourse, origins, language, literature, and meaning do not lend themselves readily, or traditionally, to equations, probabilities, and models. However, with the increased adoption of natural science tools in economics, anthropology, and political science—to name only a few social scientific fields highlighted in this volume—quantitative methods in the humanities are becoming more common.
The theory of complexity holds significant promise for better understanding social and human phenomena based on interactions among the participating "agents," whatever they may be: a thought, a person, a conversation, a sentence, or an email. Such systems can exhibit phase transitions, feedback loops, self-organization, and emergent properties. These dynamic systems lend themselves naturally to the kind of analysis made possible by models and simulations developed with complex science tools. This volume offers a tour of quantitative analyses, models, and simulations of humanities and social science phenomena that have been historically the purview of qualitative methods.
Emergence is a common phenomenon, and it is also a general and important concept in complex dynamic systems like artificial societies. Usually, artificial societies are used to assist in resolving complex social issues (e.g., emergency management, intelligent transportation systems) with the aid of computer science. The level of an emergence may have an effect on decision making, and the occurrence and degree of an emergence are generally perceived by human observers. However, given the ambiguity and inaccuracy of human observers, proposing a quantitative method to measure emergences in artificial societies is a meaningful and challenging task. This article mainly concentrates upon three kinds of emergences in artificial societies: emergence of attribution, emergence of behavior, and emergence of structure. Based on information entropy, three metrics have been proposed to measure emergences in a quantitative way. Meanwhile, the correctness of these metrics has been verified through three case studies (the spread of an infectious influenza, a dynamic microblog network, and a flock of birds) with several experimental simulations on the NetLogo platform. The experimental results confirm that these metrics increase with the rising degree of emergence. In addition, the article also discusses the limitations and extended applications of these metrics.
Information Entropy-Based Metrics for Measuring Emergences in Artificial Societies Mingsheng Tang and Xinjun Mao
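The entropy-based idea can be sketched in a few lines: compute the Shannon entropy of the empirical distribution of agent states and watch it change as an ordered pattern emerges. This only illustrates the ingredient the metrics are built from; the article's three metrics differ in construction (they are designed to increase with the degree of emergence), and the agent states below are hypothetical.

```python
from collections import Counter
from math import log2

def shannon_entropy(states):
    """Shannon entropy (bits) of the empirical distribution of agent states."""
    counts = Counter(states)
    n = len(states)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical agent attributes before and after an ordering transition,
# loosely in the spirit of the influenza case study (S/I/R states):
before = ["S", "I", "R", "S", "I", "R", "S", "I"]  # mixed, disordered
after  = ["I", "I", "I", "I", "I", "I", "I", "S"]  # one state dominates

print(shannon_entropy(before))  # higher: disordered population
print(shannon_entropy(after))   # lower: ordered, emergent pattern
```

A drop in attribute entropy like this signals the population ordering; an emergence metric can then be defined from such entropy differences.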
Socioinformatics is a new scientific approach to studying the interactions between humans and IT. These proceedings are a collection of the contributions to a workshop of the Gesellschaft für Informatik (GI). Researchers in this emerging field discuss the main aspects of interactions between IT and humans with respect to: social connections, social changes, acceptance of IT and the social conditions affecting this acceptance, effects of IT on humans and the resulting changes to IT, structures of society and the influence of IT on these structures, and changes in metaphysics influenced by IT and the social context of a knowledge society.
More than 40 years ago, Schelling introduced one of the first agent-based models in the social sciences. The model showed that even if people only have a mild preference for living with neighbors of the same color, complete segregation will occur. This model has been much discussed by social scientists and analyzed by physicists using analogies with spin-1 Ising models and other systems. Here, we study the metapopulation version of the model, which mimics the division of a city into neighborhoods, and we present the first analysis to our knowledge that gives detailed information about the structure of equilibria and explicit formulas for their densities.
Topological properties of networks have recently been widely applied to the link-prediction problem. Common Neighbors, for example, is a natural yet efficient framework, and many variants of it have been proposed to further boost the discriminative resolution of candidate links. In this paper, we reexamine the role of network topology in predicting missing links from the perspective of information theory, and present a practical approach based on the mutual information of network structures. It not only improves the prediction accuracy substantially, but also has reasonable computational complexity.
The stability analysis of socioeconomic systems has centred on answering whether small perturbations, when a system is in a given quantitative state, will push the system permanently into a different quantitative state. However, the quantitative state of socioeconomic systems is typically subject to constant change. Therefore, a key stability question that has been under-investigated is how strongly the conditions of a system itself can change before the system moves to a qualitatively different behaviour, i.e. how structurally stable the system is. Here, we introduce a framework to investigate the structural stability of socioeconomic systems formed by a network of interactions among agents competing for resources. We measure the structural stability of the system as the range of conditions in the distribution and availability of resources compatible with the qualitative behaviour in which all the constituent agents can be self-sustained across time. To illustrate our framework, we study an empirical representation of the global socioeconomic system formed by countries sharing and competing for multinational companies, used as a proxy for resources. We demonstrate that the structural stability of the system is inversely associated with the level of competition and the level of heterogeneity in the distribution of resources. Importantly, we show that the qualitative behaviour of the observed global socioeconomic system is highly sensitive to changes in the distribution of resources. We believe that this work provides a methodological basis to develop sustainable strategies for socioeconomic systems subject to constantly changing conditions.
How structurally stable are global socioeconomic systems? Serguei Saavedra, Rudolf P. Rohr, Luis J. Gilarranz, Jordi Bascompte
* Editorial by Paul Bourgine
* Call for new CS-DC e-Laboratories and e-Departments
* Meeting of the UNESCO UniTwin CS-DC at ECCS'14: Science, Policy, and Applications
* International workshop on Contagion Dynamics in Socio-economic Systems
* Tools: CS-DC web conferencing tool
* News from the CS-DC e-Laboratories
* e-Laboratory on Education
* e-Laboratory on Climate System / Human System Interaction
* e-Laboratory on Human-trace
* e-Laboratory on Situated Collective Intelligence
Understanding norms is a key challenge in sociology. Nevertheless, there is a lack of dynamical models explaining how one of several possible behaviors is established as a norm and under what conditions. Analysing an agent-based model, we identify interesting parameter dependencies that determine when two behaviors will coexist and when a shared norm will emerge in a heterogeneous society where different populations have incompatible preferences. Our model highlights the importance of randomness, spatial interactions, non-linear dynamics, and self-organization. It can also explain the emergence of unpopular norms that do not maximize the collective benefit. Furthermore, we compare behavior-based with preference-based punishment and find interesting results concerning hypocritical punishment. Strikingly, pressuring others to perform the same public behavior as oneself is more effective in promoting norms than pressuring others to meet one's own private preference. Finally, we show that adaptive group pressure exerted by randomly occurring, local majorities may create norms under conditions where different behaviors would normally coexist.
Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake in the use of (Shannon) information-theoretic measures to analyse the dynamics of complex systems in these fields. We introduce the Java Information Dynamics Toolkit (JIDT): a Google Code project which provides a standalone, open-source (GNU GPL v3 licensed) code implementation for empirical estimation of information-theoretic measures from time-series data. While the toolkit provides classic information-theoretic measures (e.g. entropy, mutual information, conditional mutual information), it ultimately focusses on implementing higher-level measures for information dynamics. That is, JIDT focusses on quantifying information storage, transfer and modification, and the dynamics of these operations in space and time. For this purpose, it includes implementations of the transfer entropy and active information storage, their multivariate extensions and local or pointwise variants. JIDT provides implementations for both discrete and continuous-valued data for each measure, including various types of estimator for continuous data (e.g. Gaussian, box-kernel and Kraskov-Stoegbauer-Grassberger) which can be swapped at run-time thanks to Java's object-oriented polymorphism. Furthermore, while written in Java, the toolkit can be used directly in MATLAB, GNU Octave and Python. We present the principles behind the code design, and provide several examples to guide users.
"JIDT: An information-theoretic toolkit for studying the dynamics of complex systems" Joseph T. Lizier, arXiv:1408.3270, 2014 http://arxiv.org/abs/1408.3270
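The classic discrete measures that JIDT implements, such as entropy and mutual information of time series, can be illustrated with a plain-Python sketch. This does not use JIDT itself (see the project page for its Java API); the time series below are hypothetical.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy (bits) of a discrete series."""
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for paired discrete series."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

x = [0, 1, 0, 1, 0, 1, 0, 1]
y = [0, 1, 0, 1, 1, 0, 1, 0]  # agrees with x on exactly half the samples

print(mutual_information(x, x))  # identical series: I = H(X) = 1.0 bit
print(mutual_information(x, y))  # statistically independent here: 0.0 bits
```

JIDT's focus is on the higher-level dynamical measures built from these quantities (transfer entropy, active information storage) and on estimators for continuous data.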