|
Scooped by
Complexity Digest
March 7, 4:32 AM
|
Guanghui Yan, Jie Tang, Huayan Pei, and Wenwen Chang Advances in Complex Systems, Vol. 28, No. 01n02, 2550005 (2025) Considering that rumor propagation is affected by many factors in real life, an extended ISRI rumor propagation model is proposed, building on the SIRS infectious disease model in complex networks and using probability functions to define influence mechanisms such as the trust mechanism and the suspicion mechanism. First, dynamic equations are established for homogeneous and heterogeneous networks, and the rumor and rumor-free equilibrium points in the two networks are analyzed, respectively. Then, the basic reproduction number R0 is obtained using the next-generation matrix and derivative calculation methods. Next, a Lyapunov function is constructed to discuss the local and global stability of the equilibrium points, and the influence of different parameters on R0. In addition, we selected ER and BA networks and found that population flow has a significant impact on the speed and scale of rumor propagation. At the same time, the trust mechanism increases the propagation speed and scale, while the suspicion mechanism inhibits the propagation speed, an effect that is more pronounced in the BA network. The interaction between these mechanisms further affects the propagation characteristics of rumors in the network. Read the full article at: www.worldscientific.com
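As a rough illustration of the threshold logic described above (not the paper's ISRI equations, which are not reproduced in this summary), a minimal ignorant-spreader-stifler sketch in which a trust parameter routes informed contacts to spreading and its complement to suspicion-driven stifling; all parameter names and values are illustrative:

```python
import numpy as np

# Euler integration of a minimal ignorant(I)-spreader(S)-stifler(R)
# rumor model on a homogeneous network with mean degree k. A fraction
# 'trust' of informed ignorants start spreading; the rest grow
# suspicious and stifle. Not the paper's ISRI specification.
lam, trust, delta, k, dt = 0.3, 0.7, 0.2, 6.0, 0.01
I, S, R = 0.99, 0.01, 0.0
for _ in range(10_000):
    contact = lam * k * I * S                    # rumor-bearing contacts
    I += dt * (-contact)
    S += dt * (trust * contact - delta * S)      # spreaders eventually tire
    R += dt * ((1 - trust) * contact + delta * S)

# Heuristic epidemic-style threshold for this sketch (not the paper's R0).
print(f"final stifler fraction: {R:.3f}, R0 ~ {trust * lam * k / delta:.2f}")
```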
|
Scooped by
Complexity Digest
March 6, 12:43 PM
|
Manuel Baltieri, Martin Biehl, Matteo Capucci, Nathaniel Virgo The internal model principle, originally proposed in the theory of control of linear systems, nowadays represents a more general class of results in control theory and cybernetics. The central claim of these results is that, under suitable assumptions, if a system (a controller) can regulate against a class of external inputs (from the environment), it is because the system contains a model of the system causing these inputs, which can be used to generate signals counteracting them. Similar claims on the role of internal models appear also in cognitive science, especially in modern Bayesian treatments of cognitive agents, often suggesting that a system (a human subject, or some other agent) models its environment to adapt against disturbances and perform goal-directed behaviour. It is however unclear whether the Bayesian internal models discussed in cognitive science bear any formal relation to the internal models invoked in standard treatments of control theory. Here, we first review the internal model principle and present a precise formulation of it using concepts inspired by categorical systems theory. This leads to a formal definition of 'model' generalising its use in the internal model principle. Although this notion of model is not a priori related to the notion of Bayesian reasoning, we show that it can be seen as a special case of possibilistic Bayesian filtering. This result is based on a recent line of work formalising, using Markov categories, a notion of 'interpretation', describing when a system can be interpreted as performing Bayesian filtering on an outside world in a consistent way. Read the full article at: arxiv.org
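For readers unfamiliar with the filtering side of the argument, here is the ordinary probabilistic special case that the paper's possibilistic, categorical construction generalizes: a standard discrete Bayesian filter whose transition and likelihood matrices play the role of the internal model (toy values, not from the article):

```python
import numpy as np

# Standard discrete Bayesian filtering: the internal 'model' consists of
# a transition matrix T and an observation likelihood L.
T = np.array([[0.9, 0.1],    # T[i, j] = P(next state j | current state i)
              [0.2, 0.8]])
L = np.array([[0.7, 0.3],    # L[i, o] = P(observation o | state i)
              [0.1, 0.9]])

def bayes_filter(belief, obs):
    predicted = belief @ T               # predict: push belief forward
    posterior = predicted * L[:, obs]    # update: weight by likelihood
    return posterior / posterior.sum()   # renormalize

belief = np.array([0.5, 0.5])
for obs in [0, 0, 1, 1, 1]:
    belief = bayes_filter(belief, obs)
    print(belief)
```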
|
Scooped by
Complexity Digest
February 28, 7:53 AM
|
Guozheng Lin, Ramon Escobedo, Xu Li, Tingting Xue, Zhangang Han, Clément Sire, Vishwesha Guttal, Guy Theraulaz How animal groups dynamically adjust their collective behavior in response to environmental changes is an open and challenging question. Here, we investigate the mechanisms that allow fish schools to tune their collective state under stress, testing the hypothesis that these systems operate near criticality, a state maximizing sensitivity, responsiveness, and adaptability. We combine experiments and data-driven computational modeling to study how group size and stress influence the collective behavior of rummy-nose tetras (Hemigrammus rhodostomus). We quantify the collective state of fish schools using polarization, milling, and cohesion metrics and use a burst-and-coast model to infer the social interaction parameters that drive these behaviors. Our results indicate that group size modulates stress levels, with smaller groups experiencing higher baseline stress, likely due to a reduced social buffering effect. Under stress, fish adjust the strength of their social interactions in a way that leads the group into a critical state, thus enhancing its sensitivity to perturbations and facilitating rapid adaptation. However, large groups require an external stressor to enter the critical regime, whereas small groups are already near this state. Unlike previous studies suggesting that fish adjust their interaction network structure under risk, our results suggest that the intensity of social interactions, rather than network structure, governs collective state transitions. This simpler mechanism reduces cognitive demands while enabling dynamic adaptation. By revealing how stress and group size drive self-organization toward criticality, our study provides fundamental insights into the adaptability of collective biological systems and the emergent properties in animal groups. Read the full article at: www.biorxiv.org
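The polarization, milling, and cohesion metrics mentioned above have standard definitions in this literature; a minimal sketch on synthetic positions and headings (the paper's exact conventions may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic school: positions and unit heading vectors for N fish.
N = 50
pos = rng.normal(size=(N, 2))
theta = rng.uniform(0, 2 * np.pi, N)
head = np.column_stack([np.cos(theta), np.sin(theta)])

# Polarization: norm of the mean heading (1 = aligned, ~0 = disordered).
polarization = np.linalg.norm(head.mean(axis=0))

# Milling: mean angular momentum about the group centroid.
rel = pos - pos.mean(axis=0)
unit_rel = rel / np.linalg.norm(rel, axis=1, keepdims=True)
cross = unit_rel[:, 0] * head[:, 1] - unit_rel[:, 1] * head[:, 0]
milling = abs(cross.mean())

# Cohesion: mean distance to the centroid (smaller = more cohesive).
cohesion = np.linalg.norm(rel, axis=1).mean()
print(polarization, milling, cohesion)
```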
|
Scooped by
Complexity Digest
February 24, 5:21 PM
|
M. A. Polo-González, A. P. Riascos, L. K. Eraso-Hernandez In this paper, we introduce a mathematical framework to assess the impact of damage, defined as the reduction of weight in a specific link, on identical oscillator systems governed by the Kuramoto model and coupled through weighted networks. We analyze how weight modifications in a single link affect the system when its global function is to achieve the synchronization of coupled oscillators starting from random initial phases. We introduce different measures that allow the identification of cases where damage enhances synchronization (antifragile response), deteriorates it (fragile response), or has no significant impact. Using numerical solutions of the Kuramoto model, we investigate the effects of damage on network links where antifragility emerges. Our analysis includes lollipop graphs of varying sizes and a comprehensive evaluation of all the edges of 109 non-isomorphic graphs with six nodes. The approach is general and can be applied to study antifragility in other oscillator systems with different coupling mechanisms, offering a pathway for the quantitative exploration of antifragility in diverse complex systems. Read the full article at: arxiv.org
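A minimal sketch of the damage protocol described above, assuming identical Kuramoto oscillators and using an all-to-all graph in place of the paper's lollipop graphs; parameters and the damaged link are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def order_parameter(theta):
    return abs(np.exp(1j * theta).mean())

def sync_time(W, theta0, K=1.0, dt=0.01, steps=4000, target=0.99):
    # Identical Kuramoto oscillators (natural frequencies set to zero):
    # d(theta_i)/dt = K * sum_j W_ij sin(theta_j - theta_i)
    theta = theta0.copy()
    for step in range(steps):
        diff = np.sin(theta[None, :] - theta[:, None])
        theta = theta + dt * K * (W * diff).sum(axis=1)
        if order_parameter(theta) > target:
            return step
    return steps

n = 8
W = np.ones((n, n)) - np.eye(n)           # all-to-all, unit weights
theta0 = rng.uniform(0, 2 * np.pi, n)

W_damaged = W.copy()
W_damaged[0, 1] = W_damaged[1, 0] = 0.2   # damage: reduce one link's weight

# If damage shortens the time to synchronize, the response is antifragile;
# if it lengthens it, fragile.
print("intact :", sync_time(W, theta0), "steps")
print("damaged:", sync_time(W_damaged, theta0), "steps")
```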
|
Scooped by
Complexity Digest
February 22, 9:19 AM
|
Zhang Zhang, Arsham Ghavasieh, Jiang Zhang & Manlio De Domenico Nature Communications volume 16, Article number: 1605 (2025) Information dynamics plays a crucial role in complex systems, from cells to societies. Recent advances in statistical physics have made it possible to capture key network properties, such as flow diversity and signal speed, using entropy and free energy. However, large system sizes pose computational challenges. We use graph neural networks to identify suitable groups of components for coarse-graining a network and achieve a low computational complexity, suitable for practical application. Our approach preserves information flow even under significant compression, as shown through theoretical analysis and experiments on synthetic and empirical networks. We find that the model merges nodes with similar structural properties, suggesting they perform redundant roles in information transmission. This method enables low-complexity compression for extremely large networks, offering a multiscale perspective that preserves information flow in biological, social, and technological networks better than existing methods mostly focused on network structure. Read the full article at: www.nature.com
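The entropy referred to above comes from the density-matrix formalism for networks developed in this line of work; a minimal sketch compares the von Neumann entropy of a network before and after a naive pair-merge coarse-graining (the learned GNN grouping is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)

def von_neumann_entropy(A, tau=1.0):
    # Density-matrix formalism for networks: rho = exp(-tau L) / Z with
    # L the graph Laplacian; S = -Tr(rho log rho) from the eigenvalues.
    L = np.diag(A.sum(axis=1)) - A
    lam = np.linalg.eigvalsh(L)
    w = np.exp(-tau * lam)
    p = w / w.sum()
    p = p[p > 1e-12]
    return float(-(p * np.log(p)).sum())

# Random test network.
n = 20
A = (rng.random((n, n)) < 0.2).astype(float)
A = np.triu(A, 1); A = A + A.T

# Naive coarse-graining: merge fixed node pairs (2i, 2i+1). The paper
# instead learns which nodes to merge with a graph neural network so
# that entropy and information flow are preserved as well as possible.
P = np.zeros((n // 2, n))
for g in range(n // 2):
    P[g, 2 * g] = P[g, 2 * g + 1] = 1.0
A_coarse = P @ A @ P.T        # weights count links between groups
np.fill_diagonal(A_coarse, 0)

print(f"entropy original: {von_neumann_entropy(A):.3f}")
print(f"entropy coarse  : {von_neumann_entropy(A_coarse):.3f}")
```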
|
Scooped by
Complexity Digest
February 21, 3:13 PM
|
Fengli Xu, Qi Wang, Esteban Moro, Lin Chen, Arianna Salazar Miranda, Marta C. González, Michele Tizzoni, Chaoming Song, Carlo Ratti, Luis Bettencourt, Yong Li & James Evans Nature Human Behaviour (2025) The lived experience of urban life is shaped by personal mobility through dynamic relationships and resources, marked not only by access and opportunity, but also by inequality and segregation. The recent availability of fine-grained mobility data and context attributes ranging from venue type to demographic mixture offers researchers a deeper understanding of experienced inequalities at scale, and poses many new questions. Here we review emerging uses of urban mobility behaviour data, and propose an analytic framework to represent mobility patterns as a temporal bipartite network between people and places. As this network reconfigures over time, analysts can track experienced inequality along three critical dimensions: social mixing with others from specific demographic backgrounds, access to different types of facilities, and spontaneous adaptation to unexpected events, such as epidemics, conflicts or disasters. This framework traces the dynamic, lived experiences of urban inequality and complements prior work on static inequalities experienced at home and work. Xu et al. review applications of urban mobility behaviour data and propose a temporal bipartite network that reveals mobility patterns between people and places. It helps to track urban inequalities in social mixing, facility access and adaptation. Read the full article at: www.nature.com
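A minimal sketch of the proposed representation, assuming synthetic visit data: a temporal bipartite people-places network and a toy social-mixing exposure measure (the framework's exact metrics may differ):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy temporal bipartite network: visits[t][person] = place visited in
# time window t. Demographic group membership per person is synthetic.
n_people, n_places, n_windows = 100, 10, 4
group = rng.integers(0, 2, n_people)               # two demographic groups
visits = rng.integers(0, n_places, (n_windows, n_people))

# Experienced social mixing: for each person and window, the share of
# co-visitors from the *other* group, averaged over time windows.
exposure = np.zeros(n_people)
for t in range(n_windows):
    for place in range(n_places):
        members = np.where(visits[t] == place)[0]
        if len(members) < 2:
            continue
        for i in members:
            others = members[members != i]
            exposure[i] += (group[others] != group[i]).mean() / n_windows

print("mean cross-group exposure:", exposure.mean())
```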
|
Scooped by
Complexity Digest
February 21, 12:43 PM
|
Elma Dervić, Katharina Ledebur, Stefan Thurner & Peter Klimek Scientific Data volume 12, Article number: 215 (2025) Comorbidity networks have become a valuable tool to support data-driven biomedical research. Yet studies are often severely hindered by the limited availability of the necessary comprehensive data, frequently due to the sensitivity of health care information. This study presents a population-wide comorbidity network dataset derived from 45 million hospital stays of 8.9 million patients over 17 years in Austria. We present co-occurrence networks of hospital diagnoses, stratified by age, sex, and observation period, in a total of 96 different subgroups. For each of these groups we report a range of association measures (e.g., count data and odds ratios) for all pairs of diagnoses. The dataset gives researchers the possibility to create their own tailor-made comorbidity networks from real patient data that can be used as a starting point in quantitative and machine learning methods. This data platform is intended to lead to deeper insights into a wide range of epidemiological, public health, and biomedical research questions. Read the full article at: www.nature.com
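A minimal sketch of the kind of association measure reported in the dataset, assuming a synthetic patient-diagnosis indicator matrix: the standard 2x2 odds ratio (with Haldane correction) for a diagnosis pair:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic patient x diagnosis indicator matrix (1 = diagnosis present).
n_patients, n_dx = 10_000, 5
X = (rng.random((n_patients, n_dx)) < 0.1).astype(int)
X[:, 1] |= (X[:, 0] & (rng.random(n_patients) < 0.5)).astype(int)  # induce comorbidity

def odds_ratio(a, b, eps=0.5):
    # Standard 2x2 contingency odds ratio with Haldane correction eps.
    n11 = np.sum((a == 1) & (b == 1)) + eps
    n10 = np.sum((a == 1) & (b == 0)) + eps
    n01 = np.sum((a == 0) & (b == 1)) + eps
    n00 = np.sum((a == 0) & (b == 0)) + eps
    return (n11 * n00) / (n10 * n01)

# Edge weight of the comorbidity network: correlated vs independent pair.
print(odds_ratio(X[:, 0], X[:, 1]), odds_ratio(X[:, 0], X[:, 2]))
```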
|
Scooped by
Complexity Digest
February 15, 8:47 AM
|
Waldemar Karwowski, et al. International Journal of Production Research Contemporary society faces a growing set of complex issues representing significant socioeconomic, health and well-being, environmental, and sustainability challenges. The discipline of industrial and systems engineering (ISE) can play an important role in addressing these issues. This paper identifies and discusses eight grand challenges for ISE. These grand challenges are (1) Artificial Intelligence (AI) For Business and Personal Use: Decision-Making and System Design and Operations, (2) Cybersecurity and Resilience, (3) Sustainability: Environment, Energy and Infrastructure, (4) Health Issues, (5) Social Issues, (6) Logistics and Supply Chain, (7) System Integration and Operations: Humans, Automation, and AI, and (8) Industrial and Systems Engineering Education. The discussed grand challenges were derived by accomplished ISE professionals who are the authors of this paper. The implications of the ISE grand challenges for education, training, research, and implementation of ISE principles and methodologies for the benefit of global society are discussed. Read the full article at: www.tandfonline.com
|
Scooped by
Complexity Digest
February 14, 4:06 AM
|
Jonah E. Friederich, Everton S. Medeiros, Sabine H. L. Klapp, Anna Zakharova In networked systems, stochastic resonance occurs as a collective phenomenon where the entire stochastic network resonates with a weak applied periodic signal. Beyond the interplay among the network coupling, the amplitude of the external periodic signal, and the intensity of stochastic fluctuations, the maintenance of stochastic resonance also crucially depends on the resonance capacity of each oscillator composing the network. This scenario raises the question: Can local defects in the ability of oscillators to resonate break down the stochastic resonance phenomenon in the entire network? Here, we investigate this possibility in complex networks of prototypical bistable oscillators in a double-well potential. We test the sustainability of stochastic resonance by considering a fraction of network oscillators with nonresonant potential landscapes. We find that the number of nonresonant oscillators that the network can sustain depends nonlinearly on their dissimilarity from the rest of the network oscillators. In addition, we unravel the role of the network topology and coupling strength in maintaining, or suppressing, the stochastic resonance for different noise levels and numbers of nonresonant oscillators. Finally, we obtain a low-dimensional deterministic model confirming the results observed for the networks. Read the full article at: arxiv.org
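For intuition, a minimal sketch of the prototypical single-oscillator setup underlying the network study: an overdamped particle in a double-well potential with a weak periodic drive, where interwell hopping is most regular at intermediate noise (parameters illustrative; the network and defect aspects are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(5)

# Overdamped bistable oscillator: dx/dt = x - x^3 + A sin(w t) + noise.
# Stochastic resonance: hopping between the wells cooperates with the
# weak drive most effectively at an intermediate noise intensity D.
def interwell_hops(D, A=0.3, w=0.1, dt=0.01, steps=50_000):
    x, hops, last_sign = 1.0, 0, 1
    for i in range(steps):
        x += dt * (x - x**3 + A * np.sin(w * i * dt)) \
             + np.sqrt(2 * D * dt) * rng.normal()
        s = 1 if x > 0 else -1
        if s != last_sign:
            hops, last_sign = hops + 1, s
    return hops

for D in [0.05, 0.2, 0.8]:
    print(f"noise D={D}: {interwell_hops(D)} interwell hops")
```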
|
Scooped by
Complexity Digest
February 13, 5:54 AM
|
Mario Franco, Gerardo Febres, Nelson Fernández, Carlos Gershenson Classification is a ubiquitous and fundamental problem in artificial intelligence and machine learning, with extensive efforts dedicated to developing more powerful classifiers and larger datasets. However, the classification task is ultimately constrained by the intrinsic properties of datasets, independently of computational power or model complexity. In this work, we introduce a formal entropy-based measure of classificability, which quantifies the inherent difficulty of a classification problem by assessing the uncertainty in class assignments given feature representations. This measure captures the degree of class overlap and aligns with human intuition, serving as an upper bound on achievable classification performance. Our results establish, for a given problem, a theoretical limit beyond which no classifier can improve classification accuracy, regardless of architecture or amount of data. Our approach provides a principled framework for understanding when classification is inherently fallible and fundamentally ambiguous. Read the full article at: arxiv.org
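A minimal sketch in the spirit of the measure described above (the paper's exact definition may differ): estimate the conditional entropy H(Y|X) and the induced ceiling on accuracy by binning a 1D feature with overlapping classes:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(6)

# Two overlapping 1D Gaussian classes; overlap makes labels ambiguous.
n = 20_000
y = rng.integers(0, 2, n)
x = rng.normal(loc=y * 1.0, scale=1.0)     # class means 0 and 1

# Estimate H(Y|X) by binning x; the Bayes-optimal accuracy is bounded by
# how concentrated the labels are within each bin.
bins = np.digitize(x, np.linspace(x.min(), x.max(), 30))
H, bayes_acc = 0.0, 0.0
for b, count in Counter(bins).items():
    p_bin = count / n
    p1 = y[bins == b].mean()
    if 0 < p1 < 1:
        H -= p_bin * (p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1))
    bayes_acc += p_bin * max(p1, 1 - p1)

print(f"H(Y|X) ~ {H:.3f} bits; no classifier can beat ~ {bayes_acc:.3f}")
```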
|
Scooped by
Complexity Digest
February 12, 8:44 AM
|
Competence Modelling From the Perspective of Complex Systems Theories: A Systematic Literature Review Pedagogika Vol. 156 No. 4 This article aims to investigate how the notion of competence is conceptualised and modelled from the point of view of complex systems theories. Although the importance of competences and competency-based education is widely acknowledged, the concept of competence keeps evolving and remains difficult to define in today's constantly changing and uncertain VUCA world. Therefore, this study explores how the approach of complex systems, which is increasingly applied in educational research, can contribute to the definition of competence. The article presents a systematic review of 21 articles, published in 2000–2023 and indexed in various databases, revealing that from the perspective of complex systems, competence can be conceptualised both at the individual level and at the level of the whole system or organisation; it can follow a functionalist or contextual approach. Based on the research findings, it is assumed that in certain cases competence can be treated as an emergent phenomenon or even as an entire complex system, characterised by such properties as non-linearity, chaos, emergence, feedback loops, etc. Finally, this article reviews the variety of complexity-informed mathematical/computational and theoretical models utilised in the reviewed studies, the application of which opens up new avenues in overall educational research. Read the full article at: ejournals.vdu.lt
|
Scooped by
Complexity Digest
February 11, 10:33 AM
|
Anabele-Linda Pardi & Elizaveta Burina Scientific Reports volume 14, Article number: 10590 (2024) In the contemporary context of an acute need for sustainability and swift response to imminent crises such as global warming, pandemics and economic system disruptions, the focus on responsible decision making, ethical risk assessment and mitigation at all organizational levels is an overarching goal. Our aim is to introduce a deterministic method for investigating the stability of complex systems, in order to find the most important elements of such systems and their impact on different scenarios. The novelty of the current approach lies in its compact format and intuitive nature, designed to accommodate a limited amount of computational resources. The proposed modelling method involves mapping complex systems from a diversity of disciplines (economic markets, the resource management domain and the community impact of suburbanisation) onto a sequence of chemical reactions, followed by a mathematical analysis. Mapping the results back onto the use cases shows that one can retrieve a considerable amount of detail, making the modelling strategy general enough to be adaptable and scalable while also detailed enough to provide valuable insights for practical scenarios. Read the full article at: www.nature.com
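For intuition about the chemical-reaction target of the mapping, a minimal mass-action sketch for a single mapped reaction A + B -> C (the mapping procedure itself is not reproduced; rate constant and concentrations are illustrative):

```python
# Minimal mass-action kinetics for a mapped 'reaction' A + B -> C:
# concentrations evolve as dA/dt = dB/dt = -k*A*B and dC/dt = +k*A*B.
# Stability of the steady state can then be read off the Jacobian.
k, dt = 0.5, 0.01
A, B, C = 1.0, 0.8, 0.0
for _ in range(2000):
    rate = k * A * B
    A -= dt * rate
    B -= dt * rate
    C += dt * rate
print(f"steady state: A={A:.3f}, B={B:.3f}, C={C:.3f}")
```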
|
Scooped by
Complexity Digest
February 6, 7:15 PM
|
Oriol Cabanas-Tirapu, Lluís Danús, Esteban Moro, Marta Sales-Pardo & Roger Guimerà Nature Communications volume 16, Article number: 1336 (2025) Modeling human mobility is critical to address questions in urban planning, sustainability, public health, and economic development. However, our understanding and ability to model flows between urban areas are still incomplete. At one end of the modeling spectrum we have gravity models, which are easy to interpret but provide modestly accurate predictions of flows. At the other end, we have machine learning models, with tens of features and thousands of parameters, which predict mobility more accurately than gravity models but do not provide clear insights on human behavior. Here, we show that simple machine-learned, closed-form models of mobility can predict mobility flows as accurately as complex machine learning models, and extrapolate better. Moreover, these models are simple and gravity-like, and can be interpreted similarly to standard gravity models. These models work for different datasets and at different scales, suggesting that they may capture the fundamental universal features of human mobility. Read the full article at: www.nature.com
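A minimal sketch of fitting a gravity-like closed form, assuming synthetic flows; the paper instead discovers such expressions by machine learning, but the recovered structure is of this type:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic origin-destination data generated by a gravity law
# F = m_i^a * m_j^b / d^c, then recovered by regression in log space.
n = 200
m_i = rng.lognormal(10, 1, n)       # origin populations
m_j = rng.lognormal(10, 1, n)       # destination populations
d = rng.lognormal(3, 0.5, n)        # distances
flows = m_i**0.8 * m_j**0.7 / d**1.5 * rng.lognormal(0, 0.1, n)

# log F = a log m_i + b log m_j - c log d + const
X = np.column_stack([np.log(m_i), np.log(m_j), np.log(d), np.ones(n)])
coef, *_ = np.linalg.lstsq(X, np.log(flows), rcond=None)
print("recovered exponents (a, b, -c):", coef[:3].round(2))
```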
|
Scooped by
Complexity Digest
March 6, 4:41 PM
|
Keith Y Patarroyo, Abhishek Sharma, Ian Seet, Ignas Packmore, Sara I. Walker, Leroy Cronin Quantifying the evolution and complexity of materials is of importance in many areas of science and engineering, where a central open challenge is developing experimental complexity measurements to distinguish random structures from evolved or engineered materials. Assembly Theory (AT) was developed to measure complexity produced by selection, evolution and technology. Here, we extend the fundamentals of AT to quantify complexity in inorganic molecules and solid-state periodic objects such as crystals, minerals and microprocessors, showing how the framework of AT can be used to distinguish naturally formed materials from evolved and engineered ones by quantifying the amount of assembly using the assembly equation defined by AT. We show how tracking the Assembly of repeated structures within a material allows us to formalize the complexity of materials in a manner accessible to measurement. We confirm the physical relevance of our formal approach by applying it to phase transformations in crystals using the HCP to FCC transformation as a model system. To explore this approach, we introduce random stacking faults in close-packed systems simplified to one-dimensional strings and demonstrate how Assembly can track the phase transformation. We then compare the Assembly of close-packed structures with random or engineered faults, demonstrating its utility in distinguishing engineered materials from randomly structured ones. Our results have implications for the study of pre-genetic minerals at the origin of life, optimization of material design in the trade-off between complexity and function, and new approaches to explore material technosignatures which can be unambiguously identified as products of engineered design. Read the full article at: arxiv.org
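As a loose illustration of why reuse-based measures separate periodic from random stackings, here is a crude stand-in for the assembly index (an LZ78 parse count, emphatically not AT's pathway search) applied to one-dimensional stacking strings, ignoring the physical constraint that adjacent layers differ:

```python
import numpy as np

rng = np.random.default_rng(8)

def lz78_phrases(s):
    # LZ78 parse length: a computable proxy in the same spirit as the
    # assembly index (both reward reuse of previously built parts).
    # This is NOT Assembly Theory's minimal-pathway computation.
    seen, phrase, count = set(), "", 0
    for ch in s:
        phrase += ch
        if phrase not in seen:
            seen.add(phrase)
            count += 1
            phrase = ""
    return count + (1 if phrase else 0)

# 1D stacking sequences: FCC is the periodic 'ABC' motif; the random
# stacking draws layers uniformly from {A, B, C}.
fcc = "ABC" * 40
random_stack = "".join(rng.choice(list("ABC"), size=120))
print("engineered (FCC):", lz78_phrases(fcc))
print("random stacking :", lz78_phrases(random_stack))
```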
|
Scooped by
Complexity Digest
March 6, 11:15 AM
|
Ben Piazza, Dániel L. Barabási, André Ferreira Castro, Giulia Menichetti, Albert-László Barabási The brain has long been conceptualized as a network of neurons connected by synapses. However, attempts to describe the connectome using established network science models have yielded conflicting outcomes, leaving the architecture of neural networks unresolved. Here, by performing a comparative analysis of eight experimentally mapped connectomes, we find that their degree distributions cannot be captured by the well-established random or scale-free models. Instead, the node degrees and strengths are well approximated by lognormal distributions, although these lack a mechanistic explanation in the context of the brain. By acknowledging the physical network nature of the brain, we show that neuron size is governed by a multiplicative process, which allows us to analytically derive the lognormal nature of the neuron length distribution. Our framework not only predicts the degree and strength distributions across each of the eight connectomes, but also yields a series of novel and empirically falsifiable relationships between different neuron characteristics. The resulting multiplicative network represents a novel architecture for network science, whose distinctive quantitative features bridge critical gaps between neural structure and function, with implications for brain dynamics, robustness, and synchronization. Read the full article at: www.biorxiv.org
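A minimal sketch of the multiplicative mechanism invoked above: repeated independent growth factors make log-sizes approximately Gaussian, hence sizes lognormal (parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(9)

# Multiplicative growth: each unit's size is repeatedly multiplied by
# independent random factors. By the central limit theorem applied to
# log-sizes, the resulting size distribution is lognormal.
n_units, n_steps = 100_000, 50
log_size = np.zeros(n_units)
for _ in range(n_steps):
    log_size += np.log(rng.uniform(0.9, 1.2, n_units))  # random growth factor
sizes = np.exp(log_size)

# Log-sizes should look Gaussian: compare skewness of raw vs log values.
skew = lambda v: ((v - v.mean())**3).mean() / v.std()**3
print(f"skewness raw: {skew(sizes):.2f}, skewness of logs: {skew(np.log(sizes)):.2f}")
```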
|
Scooped by
Complexity Digest
February 25, 2:16 PM
|
Colin J. Carlson, Cole B. Brookson, Daniel J. Becker, Caroline A. Cummings, Rory Gibb, Fletcher W. Halliday, Alexis M. Heckley, Zheng Y. X. Huang, Torre Lavelle, Hailey Robertson, Amanda Vicente-Santos, Ciara M. Weets & Timothée Poisot Nature Reviews Biodiversity volume 1, pages 32–49 (2025) Emerging infectious diseases, biodiversity loss, and anthropogenic environmental change are interconnected crises with massive social and ecological costs. In this Review, we discuss how pathogens and parasites are responding to global change, and the implications for pandemic prevention and biodiversity conservation. Ecological and evolutionary principles help to explain why both pandemics and wildlife die-offs are becoming more common; why land-use change and biodiversity loss are often followed by an increase in zoonotic and vector-borne diseases; and why some species, such as bats, host so many emerging pathogens. To prevent the next pandemic, scientists should focus on monitoring and limiting the spread of a handful of high-risk viruses, especially at key interfaces such as farms and live-animal markets. But to address the much broader set of infectious disease risks associated with the Anthropocene, decision-makers will need to develop comprehensive strategies that include pathogen surveillance across species and ecosystems; conservation-based interventions to reduce human–animal contact and protect wildlife health; health system strengthening; and global improvements in epidemic preparedness and response. Scientists can contribute to these efforts by filling global gaps in disease data, and by expanding the evidence base for disease–driver relationships and ecological interventions. Read the full article at: www.nature.com
|
Scooped by
Complexity Digest
February 23, 9:15 AM
|
Ritam Pal, Aanjaneya Kumar, and M. S. Santhanam Phys. Rev. Lett. 134, 017401 Elections for public offices in democratic nations are large-scale examples of collective decision-making. Because elections are complex systems with a multitude of interactions among agents, we can anticipate that universal macroscopic patterns could emerge, independent of microscopic details. Despite the availability of empirical election data, such universality, valid at all scales, countries, and elections, has not yet been observed. In this Letter, we propose a parameter-free voting model and analytically show that the distribution of the victory margin is driven by that of the voter turnout, and a scaled measure depending on margin and turnout leads to a robust universality. This is demonstrated using empirical election data from 34 countries, spanning multiple decades and electoral scales. The deviations from the model predictions and universality indicate possible electoral malpractices. We argue that this universality is a stylized fact indicating the competitive nature of electoral outcomes. Read the full article at: link.aps.org
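A toy check of the margin-turnout link described above (not the Letter's model or its exact scaled variable): raw margins inherit their scale from turnout, so dividing by turnout makes elections of very different sizes comparable:

```python
import numpy as np

rng = np.random.default_rng(10)

# Synthetic two-candidate elections: turnout T is drawn from a broad
# distribution and each vote goes to candidate A with a per-election
# popularity p. The raw margin M inherits its scale from T.
n_elections = 5000
T = rng.lognormal(8, 1, n_elections).astype(int) + 2
p = rng.beta(5, 5, n_elections)            # per-election popularity
votes_a = rng.binomial(T, p)
M = np.abs(2 * votes_a - T)                # victory margin

# The turnout-scaled margin varies far less across elections.
for name, v in [("M  ", M.astype(float)), ("M/T", M / T)]:
    print(f"{name}: coefficient of variation = {v.std() / v.mean():.2f}")
```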
|
Scooped by
Complexity Digest
February 21, 5:11 PM
|
Yu Tian, Sadamori Kojaku, Hiroki Sayama, Renaud Lambiotte Networks are powerful tools for modeling interactions in complex systems. While traditional networks use scalar edge weights, many real-world systems involve multidimensional interactions. For example, in social networks, individuals often have multiple interconnected opinions that can affect different opinions of other individuals, which can be better characterized by matrices. We propose a novel, general framework for modeling such multidimensional interacting dynamics: matrix-weighted networks (MWNs). We present the mathematical foundations of MWNs and examine consensus dynamics and random walks within this context. Our results reveal that the coherence of MWNs gives rise to non-trivial steady states that generalize the notions of communities and structural balance in traditional networks. Read the full article at: arxiv.org
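A minimal sketch of matrix-weighted consensus on a triangle, assuming a simple averaging update with rotation-matrix edge weights (the paper's exact update rule may differ); the weights are chosen coherent around the cycle, which is what permits a non-trivial steady state:

```python
import numpy as np

rng = np.random.default_rng(11)

def rot(a):
    # 2x2 rotation matrix.
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

# Three nodes on a triangle, each holding a 2D opinion vector. Every
# edge carries a rotation that transforms a neighbour's opinion before
# averaging. The weights are coherent around the cycle (0.3 + 0.3 = 0.6).
W = {(0, 1): rot(0.3), (1, 2): rot(0.3), (0, 2): rot(0.6)}
W.update({(j, i): m.T for (i, j), m in list(W.items())})  # reverse edges

x = rng.normal(size=(3, 2))
for _ in range(500):
    x_new = x.copy()
    for i in range(3):
        nbrs = [W[(i, j)] @ x[j] for j in range(3) if j != i]
        x_new[i] = x[i] + 0.1 * (np.mean(nbrs, axis=0) - x[i])
    x = x_new

# At the generalized consensus, each opinion matches its neighbours'
# opinions as seen through the edge matrices: x_i = W[(i, j)] @ x_j.
print(x.round(3))
print(np.allclose(x[0], W[(0, 1)] @ x[1], atol=1e-3))
```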
|
Scooped by
Complexity Digest
February 21, 1:32 PM
|
Moritz U. G. Kraemer, et al. Nature volume 638, pages 623–635 (2025) Infectious disease threats to individual and public health are numerous, varied and frequently unexpected. Artificial intelligence (AI) and related technologies, which are already supporting human decision making in economics, medicine and social science, have the potential to transform the scope and power of infectious disease epidemiology. Here we consider the application to infectious disease modelling of AI systems that combine machine learning, computational statistics, information retrieval and data science. We first outline how recent advances in AI can accelerate breakthroughs in answering key epidemiological questions and we discuss specific AI methods that can be applied to routinely collected infectious disease surveillance data. Second, we elaborate on the social context of AI for infectious disease epidemiology, including issues such as explainability, safety, accountability and ethics. Finally, we summarize some limitations of AI applications in this field and provide recommendations for how infectious disease epidemiology can harness most effectively current and future developments in AI. Read the full article at: www.nature.com
|
Scooped by
Complexity Digest
February 16, 8:41 AM
|
Guillaume St-Onge, Jessica T. Davis, Laurent Hébert-Dufresne, Antoine Allard, Alessandra Urbinati, Samuel V. Scarpino, Matteo Chinazzi & Alessandro Vespignani Nature Medicine (2025) Aircraft wastewater surveillance has been proposed as a new approach to monitor the global spread of pathogens. Here we develop a computational framework providing actionable information for the design and estimation of the effectiveness of global aircraft-based wastewater surveillance networks (WWSNs). We study respiratory diseases of varying transmission potential and find that networks of 10–20 strategically placed wastewater sentinel sites can provide timely situational awareness and function effectively as an early warning system. The model identifies potential blind spots and suggests optimization strategies to increase WWSN effectiveness while minimizing resource use. Our findings indicate that increasing the number of sentinel sites beyond a critical threshold does not proportionately improve WWSN capabilities, emphasizing the importance of resource optimization. We show, through retrospective analyses, that WWSNs can notably shorten detection time for emerging pathogens. The approach presented offers a realistic analytic framework for the analysis of WWSNs at airports. Read the full article at: www.nature.com
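As a rough illustration of the diminishing-returns point, a toy sentinel-selection sketch on synthetic origin-to-airport arrival probabilities (not the paper's epidemic-model-driven framework):

```python
import numpy as np

rng = np.random.default_rng(12)

# Toy sentinel-site selection: each traveler from one of n_origins
# regions arrives through exactly one of n_airports, with per-origin
# arrival probabilities. Ranking airports by expected catchment shows
# strongly diminishing returns, echoing the critical-threshold finding.
n_origins, n_airports = 200, 40
arrival = rng.dirichlet(np.ones(n_airports) * 0.3, size=n_origins)
weight = rng.lognormal(0, 1, n_origins)        # traveler volume per origin

scores = weight @ arrival                      # expected travelers per airport
order = np.argsort(scores)[::-1]               # best sentinel sites first
coverage = np.cumsum(scores[order]) / weight.sum()
for k in [1, 5, 10, 15, 20, 40]:
    print(f"{k:2d} sites -> expected traveler coverage {coverage[k - 1]:.3f}")
```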
|
Scooped by
Complexity Digest
February 14, 8:28 AM
|
Minze Wu, Tongfeng Weng, Zhuoming Ren, Xiaolu Chen and Chunzi Li EPL Self-similarity of complex networks has been extensively explored, but with a focus restricted to pairwise interactions between nodes. We restudy the self-similar characteristics of networks from an algebraic topological perspective. By virtue of a box covering technique, we generate consecutive renormalized networks with respect to different length scales. Interestingly, we find that the number of cliques of a specific order in the renormalized networks presents a clear scaling behavior. Moreover, we show that the growth pattern of cliques is likely to follow a universal principle for seemingly different kinds of real networks. Our work, for the first time, reveals the role of higher-order interactions in shaping the self-similarity of complex networks. Read the full article at: iopscience.iop.org
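A minimal sketch of the procedure described above, assuming a standard greedy box-covering heuristic: renormalize a small synthetic graph at several box sizes and count triangles (3-cliques) in each renormalized network:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(13)

# Test graph: a ring of 60 nodes plus 30 random chords.
n = 60
edges = {(i, (i + 1) % n) for i in range(n)}
edges |= {tuple(sorted(rng.choice(n, 2, replace=False))) for _ in range(30)}
adj = {i: set() for i in range(n)}
for u, v in edges:
    adj[u].add(v); adj[v].add(u)

def ball(src, radius):
    # Nodes within 'radius' hops of src.
    seen, frontier = {src}, {src}
    for _ in range(radius):
        frontier = {w for v in frontier for w in adj[v]} - seen
        seen |= frontier
    return seen

def greedy_boxes(lb):
    # Greedy box covering: repeatedly carve out a ball of radius lb // 2.
    uncovered, boxes = set(range(n)), []
    while uncovered:
        box = ball(next(iter(uncovered)), lb // 2) & uncovered
        boxes.append(box); uncovered -= box
    return boxes

for lb in [2, 3, 4]:
    boxes = greedy_boxes(lb)
    m = len(boxes)
    # Renormalize: one super-node per box, super-edge where boxes touch.
    sadj = {i: set() for i in range(m)}
    for i, j in combinations(range(m), 2):
        if any(v in adj[u] for u in boxes[i] for v in boxes[j]):
            sadj[i].add(j); sadj[j].add(i)
    tri = sum(1 for a, b, c in combinations(range(m), 3)
              if b in sadj[a] and c in sadj[a] and c in sadj[b])
    print(f"l_B={lb}: {m} boxes, {tri} triangles in renormalized network")
```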
|
Scooped by
Complexity Digest
February 13, 8:45 AM
|
Paolo Cornale, Michele Tizzani, Fabio Ciulla, Kyriaki Kalimeri, Elisa Omodei, Daniela Paolotti, Yelena Mejova Collective and individual action necessary to address climate change hinges on the public's understanding of the relevant scientific findings. In this study, we examine the use of scientific sources in the course of 14 years of public deliberation around climate change on one of the largest social media platforms, Reddit. We find that only 4.0% of the links in the Reddit posts, and 6.5% in the comments, point to domains of scientific sources, although these rates have been increasing in the past decades. These links are dwarfed, however, by citations of mass media, newspapers, and social media, the latter of which peaked especially during 2019-2020. Further, scientific sources are more likely to be posted by users who also post links to sources with a centre-left political leaning, and less so by those posting more polarized sources. Unfortunately, scientific sources are not often used in response to links to unreliable sources. Read the full article at: arxiv.org
|
Scooped by
Complexity Digest
February 12, 12:48 PM
|
Matthew Russell Barnes, Vincenzo Nicosia, Richard G. Clegg In complex networks, the rich-get-richer effect (nodes with high degree at one point in time gain more degree in their future) is commonly observed. In practice this is often studied on a static network snapshot, for example, through a preferential attachment model assumed to explain the more highly connected nodes or a rich-club effect that analyses the most highly connected nodes. In this paper, we consider temporal measures of how success (measured here as node degree) propagates across time. By analogy with social mobility (a measure of people's movement within a social hierarchy through their lives) we define hierarchical mobility to measure how a node's propensity to gain degree changes over time. We introduce an associated taxonomy of temporal correlation statistics including mobility, philanthropy and community. Mobility measures the extent to which a node's degree gain in one time period predicts its degree gain in the next. Philanthropy and community measure similar properties related to node neighbourhood. We apply these statistics both to artificial models and to 26 real temporal networks. We find that most of our networks show a tendency for individual nodes and their neighbourhoods to remain in similar hierarchical positions over time, while most networks show low correlative effects between individuals and their neighbourhoods. Moreover, we show that the mobility taxonomy can discriminate between networks from different fields. We also generate artificial network models to gain intuition about the behaviour and expected range of the statistics. The artificial models show that the opposite of the "rich-get-richer" effect requires the existence of inequality of degree in a network. Overall, we show that measuring the hierarchical mobility of a temporal network is an invaluable resource for discovering its underlying structural dynamics. Read the full article at: arxiv.org
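A minimal sketch of the mobility statistic in the sense described above (the paper's exact definition may differ): on a toy rich-get-richer temporal network, degree gain in one window correlates with gain in the next:

```python
import numpy as np

rng = np.random.default_rng(14)

# Toy temporal network with a rich-get-richer bias: in each window, new
# edge endpoints are drawn preferentially by current degree.
n, windows, edges_per_window = 200, 3, 400
degree = np.ones(n)
gains = np.zeros((windows, n))
for t in range(windows):
    for _ in range(edges_per_window):
        p = degree / degree.sum()
        u, v = rng.choice(n, size=2, replace=False, p=p)
        degree[u] += 1; degree[v] += 1
        gains[t, u] += 1; gains[t, v] += 1

# 'Mobility' in the paper's spirit: does degree gain in one window
# predict gain in the next?
r = np.corrcoef(gains[0], gains[1])[0, 1]
print(f"correlation of consecutive degree gains: {r:.2f}")
```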
|
Scooped by
Complexity Digest
February 12, 5:19 AM
|
Liad Mudrik, Melanie Boly, Stanislas Dehaene, Stephen M. Fleming, Victor Lamme, Anil Seth, Lucia Melloni Neuroscience & Biobehavioral Reviews As the field of consciousness science matures, the research agenda has expanded from an initial focus on the neural correlates of consciousness, to developing and testing theories of consciousness. Several theories have been put forward, each aiming to elucidate the relationship between consciousness and brain function. However, there is an ongoing, intense debate regarding whether these theories examine the same phenomenon. And, despite ongoing research efforts, it seems like the field has so far failed to converge around any single theory, and instead exhibits significant polarization. To advance this discussion, proponents of five prominent theories of consciousness—Global Neuronal Workspace Theory (GNWT), Higher-Order Theories (HOT), Integrated Information Theory (IIT), Recurrent Processing Theory (RPT), and Predictive Processing (PP)—engaged in a public debate in 2022, as part of the annual meeting of the Association for the Scientific Study of Consciousness (ASSC). They were invited to clarify the explananda of their theories, articulate the core mechanisms underpinning the corresponding explanations, and outline their foundational premises. This was followed by an open discussion that delved into the testability of these theories, potential evidence that could refute them, and areas of consensus and disagreement. Most importantly, the debate demonstrated that at this stage, there is more controversy than agreement between the theories, pertaining to the most basic questions of what consciousness is, how to identify conscious states, and what is required from any theory of consciousness. Addressing these core questions is crucial for advancing the field towards a deeper understanding and comparison of competing theories. Read the full article at: www.sciencedirect.com
|
Scooped by
Complexity Digest
February 8, 2:04 PM
|
Carlos M. Garrido, Francisco C. Santos, Elias Fernández Domingos, Ana M. Nunes & Jorge M. Pacheco Scientific Reports volume 15, Article number: 3865 (2025) The challenge of sustainably governing Global Risky Commons (GRC), that is, global commons in the presence of a sizable risk of overall failure, is ubiquitous and requires a global solution. A prominent example is the mitigation of the adverse effects of global warming. In this context, the Collective Risk Dilemma (CRD) provides a convenient baseline model which captures many important features associated with GRC-type problems by formulating them as problems of cooperation. Here we make use of the CRD to develop, for the first time, a bottom-up institutional governance framework for GRC. We find that the endogenous creation of local institutions that require a minimum consensus amongst group members—who, in turn, decide the nature of the institution (reward/punishment) via an electoral process—leads to higher overall cooperation than previously proposed designs, especially at low risk, proving that carrots and sticks implemented through local voting processes are more powerful than other designs. The stochastic evolutionary game-theoretical framework developed here further allows us to directly compare our results with those stemming from previous models of institutional governance. The model and the methods employed here are relevant and general enough to be applied to a variety of contemporary interdisciplinary problems. Read the full article at: www.nature.com
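For reference, a minimal sketch of the standard CRD payoff structure the framework builds on (a threshold public goods game with a risk of collective loss; parameter values illustrative):

```python
# Standard Collective Risk Dilemma payoffs: each player in a group of N
# holds endowment b; cooperators contribute a fraction c of it; if fewer
# than M players cooperate, everyone loses what they have with
# probability r (the 'risk').
def expected_payoff(cooperates, n_cooperators, b=1.0, c=0.1, M=3, r=0.9):
    kept = b - c * b if cooperates else b
    return kept if n_cooperators >= M else (1 - r) * kept

N = 6
for k in range(N + 1):                # k = cooperators in the group
    line = f"{k} cooperators: defector gets {expected_payoff(False, k):.2f}"
    if k >= 1:
        line += f", cooperator gets {expected_payoff(True, k):.2f}"
    print(line)
```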
|