|
Scooped by
Complexity Digest
April 29, 11:38 AM
|
Vaibhav P. Pai, Léo Pio-Lopez, Megan M. Sperry, Patrick Erickson, Parande Tayyebi & Michael Levin
Communications Biology, volume 8, Article number: 646 (2025)

Would transcriptomes change if cell collectives acquired a novel morphogenetic and behavioral phenotype in the absence of genomic editing, transgenes, heterologous materials, or drugs? We investigate the effects of morphology and nascent emergent life history on gene expression in the basal (no engineering, no sculpting) form of Xenobots: autonomously motile constructs derived from Xenopus embryo ectodermal cell explants. To investigate gene expression differences between cells in the context of an embryo and cells that have been freed from instructive signals and acquired novel lived experiences, we compare transcriptomes of these basal Xenobots with those of age-matched Xenopus embryos. Basal Xenobots show significantly larger inter-individual gene variability than age-matched embryos, suggesting increased exploration of the transcriptional space. We identify at least 537 (non-epidermal) transcripts uniquely upregulated in these Xenobots. Phylostratigraphy shows that most transcriptomic shifts in the basal Xenobots are towards evolutionarily ancient transcripts. Pathway analyses indicate transcriptomic shifts in the categories of motility machinery, multicellularity, stress and immune response, metabolism, thanatotranscriptome, and sensory perception of sound and mechanical stimuli. We experimentally confirm that basal Xenobots respond to acoustic stimuli via changes in behavior. Together, these data may have implications for evolution, biomedicine, and synthetic morphoengineering.

Read the full article at: www.nature.com
|
Scooped by
Complexity Digest
April 28, 11:14 AM
|
Jin Liu, Wenbin Yu, ChengJun Zhang, JiaRui Gu, Louyang Yu, Guancheng Zhong
Physica A: Statistical Mechanics and its Applications

Identifying influential spreaders in complex networks remains a significant research topic. Previous studies have primarily focused on estimating the source of spread. Our research focuses on identifying whether an infected node retains sustained infection capability during the spreading process. We define a node with continuous infection capability as an active node with high node activity, and we propose an algorithm based on node centrality to calculate node activity. Unlike established paradigms, we posit that node centrality is negatively correlated with node activity: nodes with lower centrality exhibit higher activity and infectiousness, whereas nodes with higher centrality may already have recovered from the infection, resulting in lower activity and a diminished capacity to propagate the virus. Experiments on artificial and empirical networks demonstrate that the proposed method can effectively identify nodes with sustained infection capability. The proposed method enhances our understanding of spreading dynamics and provides a valuable tool for managing and controlling the spread of information or diseases in complex networks.

Read the full article at: www.sciencedirect.com
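The central premise (activity inversely related to centrality) can be made concrete with a minimal Python sketch; the specific activity formula below is our illustrative assumption, not the authors' algorithm:

```python
import networkx as nx

def node_activity(G):
    # Assumed toy formula: activity falls as degree centrality grows,
    # so low-centrality nodes are flagged as sustained spreaders.
    centrality = nx.degree_centrality(G)
    return {v: 1.0 / (1.0 + centrality[v]) for v in G}

G = nx.karate_club_graph()
activity = node_activity(G)
top5 = sorted(activity, key=activity.get, reverse=True)[:5]
print("highest-activity (lowest-centrality) nodes:", top5)
```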
|
Scooped by
Complexity Digest
April 26, 1:18 PM
|
Adriano B. L. Tort, Diego A. Laplagne, Andreas Draguhn & Joaquin Gonzalez
Nature Reviews Neuroscience (2025)

Neuronal activities that synchronize with the breathing rhythm have been found in humans and a host of mammalian species, not only in brain areas closely related to respiratory control or olfactory coding but also in areas linked to emotional and higher cognitive functions. In parallel, evidence is mounting for modulations of perception and action by the breathing cycle. In this Review, we discuss the extent to which brain activity locks to breathing across areas, levels of organization and brain states, and the physiological origins of this global synchrony. We describe how waves of sensory activity evoked by nasal airflow spread through brain circuits, synchronizing neuronal populations to the breathing cycle and modulating faster oscillations, cell assembly formation and cross-area communication, thereby providing a mechanistic link from breathing to neural coding, emotion and cognition. We argue that, through evolution, the breathing rhythm has come to shape network functions across species.

Read the full article at: www.nature.com
|
Scooped by
Complexity Digest
April 25, 1:17 PM
|
Karan Singh, V. K. Chandrasekar, Wei Zou, Jürgen Kurths & D. V. Senthilkumar
Communications Physics, volume 8, Article number: 170 (2025)

Cascading failures pose a significant threat to the stability and functionality of complex systems, making their mitigation a crucial area of research. While existing strategies aim to enhance network robustness, identifying an optimal set of critical nodes that mediates the cascade for protection remains a challenging task. Here, we present a robust and pragmatic framework that effectively mitigates cascading failures by strategically identifying and securing critical nodes within the network. Our approach leverages a graph coloring technique to identify the critical nodes using the local network topology, and yields a set of critical nodes that is minimal yet maximally effective in mitigating the cascade, thereby retaining a large fraction of the network intact. Our method outperforms existing mitigation strategies across diverse network configurations and failure scenarios. An extensive empirical validation using real-world networks highlights the practical utility of our framework, offering a promising tool for enhancing network robustness in complex systems.

Read the full article at: www.nature.com
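To give a feel for how a coloring-based pipeline could be wired up, here is a rough Python sketch; note that the color-class selection rule is entirely our assumption, not the authors' criterion:

```python
import networkx as nx

G = nx.barabasi_albert_graph(200, 2, seed=1)

# Greedy colouring driven by local topology (largest-degree-first).
coloring = nx.greedy_color(G, strategy="largest_first")
classes = {}
for node, color in coloring.items():
    classes.setdefault(color, []).append(node)

# Hypothetical selection rule: take the highest-degree members of the
# largest colour class as the candidate set of nodes to protect.
candidates = max(classes.values(), key=len)
protect = sorted(candidates, key=G.degree, reverse=True)[:10]
print("candidate protection set:", protect)
```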
|
Scooped by
Complexity Digest
April 22, 4:06 AM
|
Arthur N. Montanari, Ana Elisa D. Barioni, Chao Duan, Adilson E. Motter

The study of flocking in biological systems has identified conditions for self-organized collective behavior, inspiring the development of decentralized strategies to coordinate the dynamics of swarms of drones and other autonomous vehicles. Previous research has focused primarily on the role of the time-varying interaction network among agents while assuming that the agents themselves are identical or nearly identical. Here, we depart from this conventional assumption to investigate how inter-individual differences between agents affect the stability and convergence of flocking dynamics. We show that flocks of agents with optimally assigned heterogeneous parameters significantly outperform their homogeneous counterparts, achieving 20-40% faster convergence to desired formations across various control tasks, including target tracking, flock formation, and obstacle maneuvering. In systems with communication delays, heterogeneity can enable convergence even when flocking is unstable for identical agents. Our results challenge existing paradigms in multi-agent control and establish system disorder as an adaptive, distributed mechanism to promote collective behavior in flocking dynamics.

Read the full article at: arxiv.org
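A toy way to see the homogeneous-vs-heterogeneous comparison is first-order consensus dynamics, a standard stand-in for flock formation. The gains below are drawn at random rather than optimally assigned, so this only sets up the experiment, not the paper's 20-40% result:

```python
import numpy as np
import networkx as nx

G = nx.connected_watts_strogatz_graph(50, 6, 0.3, seed=2)
L = nx.laplacian_matrix(G).toarray().astype(float)
rng = np.random.default_rng(2)
x0 = rng.random(50)

def settle_time(gains, dt=0.01, tol=1e-3, max_steps=50_000):
    # Consensus dynamics x' = -k_i (L x)_i with per-agent gains k_i.
    x = x0.copy()
    for step in range(max_steps):
        x -= dt * gains * (L @ x)
        if np.ptp(x) < tol:            # all agents within tolerance
            return step * dt
    return float("inf")

t_hom = settle_time(np.ones(50))
t_het = settle_time(rng.uniform(0.5, 1.5, 50))
print(f"homogeneous gains:   settled in {t_hom:.2f}s")
print(f"heterogeneous gains: settled in {t_het:.2f}s")
```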
|
Scooped by
Complexity Digest
April 19, 1:34 PM
|
Claudia Westermann
Constructivist Foundations 20(2): 67–71

Context: In 2024, we celebrated the 60th-anniversary meeting of the American Society for Cybernetics (ASC). The conference stretched over five days, comprising more than eighty mostly participatory sessions. Under the overarching theme “Living Cybernetics Playing Language,” the conference encouraged participants to reflect on cybernetics in everyday contexts, ranging from academic research to the building of communities. This special issue of Constructivist Foundations contains four target articles that emerged from this conference, together with their related discussions. Problem: Sixty years after the foundation of the ASC, defining cybernetics is still a challenge. Diversity, one could say, has haunted cybernetics since its inception. Many practices refer to cybernetics across disciplinary fields and contexts, but do these different practices share anything, or do they rely on different aspects of (historical) cybernetic practices? Method: I present the contributions to this special issue as case studies of cybernetic practice and diversity and expose them to the questions mentioned above. Results: Cybernetic practices are as diverse in their methods as the disciplines to which they relate. And yet, as the study of the four target articles and the related commentaries shows, these practices all embrace uncertainty. This embrace is the foundation for a particular technicity in which formation and reflexivity become intertwined and co-evolve. In its engagement with contemporary challenges, cybernetic technicity introduces recursive links that set relations across boundaries. Contemporary cybernetic practice, through varied approaches, is a living tradition of enacting open futures. Implications: Cybernetic thinking does not necessarily become detectable through a common vocabulary or set of references, but rather through a particular inherent logic, which links thinking and doing in a recursive, co-evolving relationship. Constructivist content: The editorial discusses second-order approaches to cybernetics.

Read the full article at: constructivist.info
|
Scooped by
Complexity Digest
April 18, 6:36 PM
|
Ori Livson, Mikhail Prokopenko

Incomputability results in formal logic and the Theory of Computation (i.e., incompleteness and undecidability) have deep implications for the foundations of mathematics and computer science. Likewise, Social Choice Theory, a branch of Welfare Economics, contains several impossibility results that place limits on the potential fairness, rationality and consistency of social decision-making processes. A formal relationship between Gödel's Incompleteness Theorems in formal logic and Arrow's Impossibility Theorem in Social Choice Theory has long been conjectured. In this paper, we bring the two theories closer together by introducing a general mathematical object called a Self-Reference System. Impossibility in Social Choice Theory is demonstrated to correspond to the impossibility of a Self-Reference System interpreting its own internal consistency. We also provide a proof of Gödel's First Incompleteness Theorem in the same terms. Together, this recasts Arrow's Impossibility Theorem as incomputability in the Gödelian sense. The incomputability results in both fields are shown to arise out of self-referential paradoxes, as exemplified by a new proof of Arrow's Impossibility Theorem centred around Condorcet Paradoxes.

Read the full article at: arxiv.org
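The Condorcet paradox at the heart of that new proof is easy to reproduce concretely. A minimal demonstration of the cyclic majority preference:

```python
from itertools import combinations

# The classic Condorcet cycle: three voters with cyclic preferences.
ballots = [("A", "B", "C"), ("B", "C", "A"), ("C", "A", "B")]

def majority_prefers(x, y):
    # x beats y if a strict majority of ballots rank x above y.
    wins = sum(b.index(x) < b.index(y) for b in ballots)
    return wins > len(ballots) / 2

for x, y in combinations("ABC", 2):
    print(f"{x} beats {y}: {majority_prefers(x, y)} | "
          f"{y} beats {x}: {majority_prefers(y, x)}")
# Output shows A>B and B>C but C>A: no consistent social ranking
# exists, the self-referential inconsistency Arrow's theorem generalises.
```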
|
Scooped by
Complexity Digest
April 13, 12:42 PM
|
Theopi Rados, et al.
Science, 3 Apr 2025, Vol 388, Issue 6742, pp. 109-115

The advent of clonal multicellularity is a critical evolutionary milestone, seen often in eukaryotes, rarely in bacteria, and only once in archaea. We show that uniaxial compression induces clonal multicellularity in haloarchaea, forming tissue-like structures. These archaeal tissues are mechanically and molecularly distinct from their unicellular lifestyle, mimicking several eukaryotic features. Archaeal tissues undergo a multinucleate stage followed by tubulin-independent cellularization, orchestrated by active membrane tension at a critical cell size. After cellularization, tissue junction elasticity becomes akin to that of animal tissues, giving rise to two cell types—peripheral (Per) and central scutoid (Scu) cells—with distinct actin and protein glycosylation polarity patterns. Our findings highlight the potential convergent evolution of a biophysical mechanism in the emergence of multicellular systems across domains of life.

Read the full article at: www.science.org
|
Scooped by
Complexity Digest
April 11, 9:47 AM
|
Tiago P. Peixoto

Network reconstruction is the task of inferring the unseen interactions between elements of a system, based only on their behavior or dynamics. This inverse problem is in general ill-posed, and admits many solutions for the same observation. Nevertheless, the vast majority of statistical methods proposed for this task, formulated as the inference of a graphical generative model, can only produce a "point estimate," i.e. a single network considered the most likely. In general, this gives only a limited characterization of the reconstruction, since uncertainties and competing answers cannot be conveyed, even when their probabilities are comparable while being structurally different. In this work we present an efficient MCMC algorithm for sampling from posterior distributions of reconstructed networks, which is able to reveal the full population of answers for a given reconstruction problem, weighted according to their plausibilities. Our algorithm is general, since it does not rely on specific properties of particular generative models, and is specially suited for the inference of large and sparse networks, since in this case an iteration can be performed in time O(N log² N) for a network of N nodes, instead of the O(N²) required by a more naive approach. We demonstrate the suitability of our method in providing uncertainties and consensus of solutions (which provably increases the reconstruction accuracy) in a variety of synthetic and empirical cases.

Read the full article at: arxiv.org
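To make the posterior-sampling idea concrete, here is a minimal Metropolis sketch over networks under an assumed toy observation model (noisy edge measurements with an Erdős–Rényi prior). It illustrates the population-of-answers idea only; it is not Peixoto's algorithm, and each step here costs O(N²) rather than the paper's subquadratic update:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: a small Erdos-Renyi graph.
N, rho = 30, 0.1
A_true = np.triu((rng.random((N, N)) < rho).astype(int), 1)
A_true += A_true.T

# Assumed observation model: true edges observed with prob p1,
# spurious edges appear with prob p0.
p1, p0 = 0.8, 0.05
X = np.triu(np.where(A_true == 1,
                     rng.random((N, N)) < p1,
                     rng.random((N, N)) < p0).astype(int), 1)
X += X.T

iu = np.triu_indices(N, 1)

def log_post(A):
    # Unnormalised log-posterior: Bernoulli likelihood + ER prior.
    a, x = A[iu], X[iu]
    ll = np.sum(x * np.log(np.where(a == 1, p1, p0))
                + (1 - x) * np.log(np.where(a == 1, 1 - p1, 1 - p0)))
    return ll + np.sum(a * np.log(rho) + (1 - a) * np.log(1 - rho))

# Metropolis over single-edge flips (symmetric proposal).
A = np.zeros_like(A_true)
curr = log_post(A)
samples = []
for step in range(100_000):
    i, j = rng.integers(N), rng.integers(N)
    if i == j:
        continue
    A[i, j] ^= 1; A[j, i] ^= 1
    prop = log_post(A)
    if np.log(rng.random()) < prop - curr:
        curr = prop                        # accept the flip
    else:
        A[i, j] ^= 1; A[j, i] ^= 1         # reject: undo the flip
    if step % 1_000 == 0:
        samples.append(A.copy())

# Posterior marginal probability of every edge, not a single network.
P_edge = np.mean(samples, axis=0)
print("mean |P(edge) - truth|:", np.abs(P_edge - A_true)[iu].mean())
```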
|
Scooped by
Complexity Digest
April 10, 10:58 AM
|
Chengbin Sun, Alfonso de Miguel-Arribas, Chaoqian Wang, Haoxiang Xia, Yamir Moreno

In real-life complex systems, individuals often encounter multiple social dilemmas that cannot be effectively captured using a single-game model. Furthermore, the environment and limited resources both play a crucial role in shaping individuals' decision-making behaviors. In this study, we employ an adaptive control mechanism by which agents may benefit from their environment, thus redefining their individual fitness. Under this setting, a detailed examination of the co-evolution of individual strategies and resource allocation is carried out. Through extensive simulations, we find that the advantageous environment mechanism not only significantly increases the proportion of cooperators in the system but also influences the resource distribution among individuals. Additionally, limited resources reinforce cooperative behaviors within the system while shaping the evolutionary dynamics and strategic interactions across different dilemmas. Once the system reaches equilibrium, resource distribution becomes highly imbalanced. To promote fairer resource allocation, we introduce a minimum resource guarantee mechanism. Our results show that this mechanism not only reduces disparities in resource distribution across the entire system and among individuals in different dilemmas but also significantly enhances cooperative behavior in higher resource intervals. Finally, to assess the robustness of our model, we further examine the influence of the advantageous environment on system-wide cooperation in small-world and random-graph network models.

Read the full article at: arxiv.org
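As a sketch of what a minimum resource guarantee can look like mechanically, here is one conserving redistribution step; the funding rule (a proportional levy on surpluses) is our own illustrative assumption, not the authors' specification:

```python
import numpy as np

def apply_resource_floor(resources, floor):
    # Top up agents below `floor`, funded by a levy proportional to
    # the surplus of agents above it; total resources are conserved.
    r = np.asarray(resources, dtype=float)
    deficit = np.clip(floor - r, 0.0, None)
    surplus = np.clip(r - floor, 0.0, None)
    if deficit.sum() == 0 or surplus.sum() == 0:
        return r
    pay = min(deficit.sum(), surplus.sum())   # partial top-up if needed
    return r + deficit * (pay / deficit.sum()) \
             - surplus * (pay / surplus.sum())

rng = np.random.default_rng(3)
before = rng.pareto(1.5, 1000)                # highly imbalanced wealth
after = apply_resource_floor(before, 0.5)
print(f"total: {before.sum():.1f} -> {after.sum():.1f} (conserved)")
print(f"relative spread (std/mean): {before.std() / before.mean():.2f} "
      f"-> {after.std() / after.mean():.2f}")
```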
|
Scooped by
Complexity Digest
April 7, 9:07 AM
|
Javier Argota Sánchez-Vaquerizo, Dirk Helbing
Cities, Volume 162, July 2025, 105901

Origin-destination (OD) matrices are essential for the analysis, planning, and simulation of urban areas, infrastructure, and transportation systems. However, they are often costly and time-consuming to determine, which reduces their potential use for informed decision-making and planning in cities. This research introduces a novel spatial econometric method that considers spatial spillover effects of socio-demographic, land-use, and topological variables to directly estimate traffic OD flows between zones of the Metropolitan Area of Barcelona. Employing a two-part Hurdle model with gradient boosting (XGBoost), our approach achieves low error rates (MAE = 6.109, RMSE = 98.774), comparable to other established models also analyzed, while the proposed method's simplicity facilitates its practical application in urban planning and policy-making. This is illustrated by applying the model to predict changes in vehicle flows resulting from the conversion of offices into other urban uses such as housing, commerce, education, or storage. Despite the related population increase, we expect a reduction in vehicle trips of up to 10% even with limited spatial interventions. Our findings suggest the model's power to assess urban trends and policies, particularly in considering teleworking expansion, housing shortages, and contemporary planning practices promoting alternative mobility modes and densification. This research underscores the dual benefits of methodological innovation and practical policy application, marking a significant advancement in urban planning.

Read the full article at: www.sciencedirect.com
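A two-part Hurdle model with XGBoost can be sketched as follows: one model for whether an OD pair has any flow, another for how much flow, given that it is positive. The synthetic features here stand in for the paper's socio-demographic, land-use, and topological covariates:

```python
import numpy as np
from xgboost import XGBClassifier, XGBRegressor

rng = np.random.default_rng(0)
n = 5000
X = rng.random((n, 3))                    # e.g. distance, pop_o, jobs_d
latent = 2 * X[:, 1] + 2 * X[:, 2] - 4 * X[:, 0]
y = np.where(latent + rng.normal(0, 0.5, n) > 0,
             np.exp(latent + rng.normal(0, 0.3, n)), 0.0)

# Part 1: does any flow occur between the OD pair?
clf = XGBClassifier(n_estimators=200, max_depth=4)
clf.fit(X, (y > 0).astype(int))

# Part 2: conditional on a positive flow, how large is it (log scale)?
reg = XGBRegressor(n_estimators=200, max_depth=4)
reg.fit(X[y > 0], np.log(y[y > 0]))

def predict_flow(X_new):
    p_positive = clf.predict_proba(X_new)[:, 1]
    conditional = np.exp(reg.predict(X_new))
    return p_positive * conditional        # unconditional expected flow

print("sample OD flow predictions:", predict_flow(X[:5]).round(2))
```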
|
Scooped by
Complexity Digest
April 3, 10:40 AM
|
Lea Karbevska & César A. Hidalgo
EPJ Data Science, volume 14, Article number: 21 (2025)

Value chain data is crucial for navigating economic disruptions. Yet, despite its importance, we lack publicly available product-level value chain datasets, since resources such as the “World Input-Output Database”, “Inter-Country Input-Output Tables”, “EXIOBASE”, and “EORA” lack information about products (e.g. Radio Receivers, Telephones, Electrical Capacitors, LCDs, etc.) and instead rely on aggregate industrial sectors (e.g. Electrical Equipment, Telecommunications). Here, we introduce a method that leverages ideas from machine learning and trade theory to infer product-level value chain relationships from fine-grained international trade data. We apply our method to data summarizing the exports and imports of 1200+ products and 250+ world regions (e.g. states in the U.S., prefectures in Japan, etc.) to infer value chain information implicit in their trade patterns. In short, we leverage the idea that, due to global value chains, regions specialized in the export of a product will tend to specialize in the import of its inputs. We use this idea to develop a novel proportional allocation model to estimate product-level trade flows between regions and countries. This provides a method for approximating value chain data at the product level that should be of interest to people working in logistics, trade, and sustainable development.

Read the full article at: epjdatascience.springeropen.com
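The core idea (specialization in exporting a product predicts specialization in importing its inputs) is usually quantified with the Balassa revealed comparative advantage (RCA) index. Below is a simplified reading of the proportional allocation step, not the authors' full model:

```python
import numpy as np

def rca(flows):
    # Balassa index for a region-by-product flow matrix: a region's
    # product share relative to the world's product share.
    f = np.asarray(flows, dtype=float)
    share_rp = f / f.sum(axis=1, keepdims=True)      # region's mix
    share_p = f.sum(axis=0) / f.sum()                # world mix
    return share_rp / share_p

# Hypothetical allocation: imports of an input product are sourced
# from regions in proportion to their export RCA in that product.
exports = np.array([[10., 1.],
                    [2., 8.],
                    [5., 5.]])            # 3 regions x 2 products
import_demand = np.array([4., 6.])        # importer's needs per product
weights = rca(exports)
allocation = import_demand * weights / weights.sum(axis=0)
print("estimated sourcing (region x product):\n", allocation.round(2))
```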
|
Scooped by
Complexity Digest
March 30, 10:47 AM
|
Robert L. Axtell, J. Doyne Farmer
Journal of Economic Literature, Vol. 63, No. 1, March 2025 (pp. 197–287)

Agent-based modeling (ABM) is a novel computational methodology for representing the behavior of individuals in order to study social phenomena. Its use is rapidly growing in many fields. We review ABM in economics and finance and highlight how it can be used to relax conventional assumptions in standard economic models. ABM has enriched our understanding of markets, industrial organization, labor, macro, development, public policy, and environmental economics. In financial markets, substantial accomplishments include understanding clustered volatility, market impact, systemic risk, and housing markets. We present a vision for how ABMs might be used in the future to build more realistic models of the economy and review some of the hurdles that must be overcome to achieve this.

Read the full article at: www.aeaweb.org
|
Scooped by
Complexity Digest
April 28, 3:40 PM
|
Malbor Asllani, Alex Arenas
Phys. Rev. E 111, 044306

Chimera states, marked by the coexistence of order and disorder in systems of coupled oscillators, have captivated researchers with their existence and intricate patterns. Despite ongoing advances, a full understanding of the genesis of chimera states remains challenging. This work formalizes a systematic method by evoking pattern formation theory to explain the emergence of chimera states in complex networks, in a similar way to how Turing patterns are produced. Employing linear stability analysis and the spectral properties of complex networks, we show that the randomness of network topology, as reflected in the localization of the graph Laplacian eigenvectors, determines the emergence of chimera patterns, underscoring the critical role of network structure. In particular, this approach explains how amplitude and phase chimeras arise separately and explores whether phase chimeras can be chaotic or not. Our findings suggest that chimeras result from the interplay between local and global dynamics at different timescales. Validated through simulations and empirical network analyses, our method enriches the understanding of coupled oscillator dynamics.

Read the full article at: link.aps.org
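The eigenvector localization that the authors tie to chimera emergence can be quantified with the inverse participation ratio (IPR). A minimal check that random topology yields more localized Laplacian eigenvectors than a regular ring:

```python
import numpy as np
import networkx as nx

def ipr(v):
    # Inverse participation ratio: ~1/N for delocalised eigenvectors,
    # approaching 1 for eigenvectors concentrated on few nodes.
    v = v / np.linalg.norm(v)
    return np.sum(v**4)

for name, G in [("regular ring", nx.cycle_graph(100)),
                ("ER random  ", nx.gnp_random_graph(100, 0.05, seed=4))]:
    L = nx.laplacian_matrix(G).toarray().astype(float)
    vals, vecs = np.linalg.eigh(L)
    iprs = [ipr(vecs[:, i]) for i in range(len(vals))]
    print(f"{name}: max IPR = {max(iprs):.3f}, mean = {np.mean(iprs):.3f}")
# The random graph shows substantially higher IPRs: the localisation
# ingredient the paper links to where chimera patterns can form.
```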
|
Scooped by
Complexity Digest
April 26, 2:12 PM
|
Enrique M. Muro, Fernando J. Ballesteros, Bartolo Luque, and Jordi Bascompte
PNAS 122 (13) e2422968122

The origin of eukaryotes represents one of the most significant events in evolution since it allowed the subsequent emergence of multicellular organisms. Yet, it remains unclear how existing regulatory mechanisms of gene activity were transformed to allow this increase in complexity. Here, we address this question by analyzing the length distribution of proteins and their corresponding genes for 6,519 species across the tree of life. We find a scale-invariant relationship between gene mean length and variance that is maintained across the entire evolutionary history. Using a simple model, we show that this scale-invariant relationship naturally originates through a simple multiplicative process of gene growth. During the first phase of this process, corresponding to prokaryotes, protein length follows gene growth. At the onset of the eukaryotic cell, however, mean protein length stabilizes around 500 amino acids. While genes continued growing at the same rate as before, this growth primarily involved noncoding sequences that complemented proteins in regulating gene activity. Our analysis indicates that this shift at the origin of the eukaryotic cell was due to an algorithmic phase transition equivalent to that of certain search algorithms, triggered by the constraints of finding increasingly larger proteins.

Read the full article at: www.pnas.org
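The scale-invariant mean-variance relationship is the statistical signature of a multiplicative process: fluctuations proportional to the mean give variance scaling as the square of the mean. A minimal numerical check of that signature (not the paper's full growth model):

```python
import numpy as np

rng = np.random.default_rng(5)

# If gene lengths arise from a multiplicative (lognormal) process, the
# coefficient of variation is the same at every scale, so across
# "genomes" variance scales as mean^2: slope 2 on log-log axes.
means, variances = [], []
for mean_len in np.geomspace(500, 50_000, 15):   # one point per genome
    lengths = mean_len * rng.lognormal(0.0, 0.4, 5000)
    means.append(lengths.mean())
    variances.append(lengths.var())

slope = np.polyfit(np.log(means), np.log(variances), 1)[0]
print(f"log-log slope of variance vs mean: {slope:.2f} (scale-invariant: 2)")
```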
|
Scooped by
Complexity Digest
April 25, 3:15 PM
|
Selin E. Nugen
AI & SOCIETY

This paper examines the application of evolutionary analogy in AI (artificial intelligence) research, focussing on narratives that perpetuate individuated and autonomous imaginaries of AI systems through biological diction. AI research has long drawn inspiration from evolution to design and predict algorithmic change. Occasionally, these narratives extend inspiration to reimagine AI as a non-human species subject to the same evolutionary pressures as biological organisms. As AI technologies embed more pervasively in public life and require critical perspectives on their social impacts, these comparisons in AI discourse raise critical questions about the limits of and responsibility in employing such analogies and their potential impact on how broader audiences consume and perceive AI systems. This paper examines the diverse ways and intentions behind how evolution is invoked in AI research narratives by analysing the adaptation of individuating evolutionary language and concepts across three fields of AI-related research: evolutionary computing, Artificial Life, and existential risk. It scrutinises the challenge of accurate scientific communication when drawing inspiration from biological evolution and assigning organismal attributes to digital technologies whilst decontextualising wider evolutionary scholarly discourses. I argue that the intertwined history between evolutionary theory and technological change, paired with the potential risks to wider perceptions of AI and biological evolution, requires (1) strategic consideration of the limits of evolutionary analogies in categorising AI in relation to biological organisms, balancing creative inspiration with scientific caution, and (2) active, collaborative multidisciplinary engagement with addressing potential misinformation, recognising that biological narratives have sociopolitical implications that influence human interaction with machines.

Read the full article at: link.springer.com
|
Scooped by
Complexity Digest
April 23, 4:07 AM
|
Henry Farrell, Alison Gopnik, Cosma Shalizi, and James Evans
Science, 13 Mar 2025, Vol 387, Issue 6739

Debates about artificial intelligence (AI) tend to revolve around whether large models are intelligent, autonomous agents. Some AI researchers and commentators speculate that we are on the cusp of creating agents with artificial general intelligence (AGI), a prospect anticipated with both elation and anxiety. There have also been extensive conversations about cultural and social consequences of large models, orbiting around two foci: immediate effects of these systems as they are currently used, and hypothetical futures when these systems turn into AGI agents, perhaps even superintelligent AGI agents. But this discourse about large models as intelligent agents is fundamentally misconceived. Combining ideas from social and behavioral sciences with computer science can help us to understand AI systems more accurately. Large models should not be viewed primarily as intelligent agents but as a new kind of cultural and social technology, allowing humans to take advantage of information other humans have accumulated.

Read the full article at: www.science.org
|
Scooped by
Complexity Digest
April 20, 10:33 AM
|
Ricard Solé, Manlio De Domenico

The path toward the emergence of life in our biosphere involved several key events allowing for the persistence, reproduction and evolution of molecular systems. All these processes took place in a given environmental context and required both molecular diversity and the right non-equilibrium conditions to sustain and favour complex self-sustaining molecular networks capable of evolving by natural selection. Life is a process that departs from non-life in several ways and cannot be reduced to standard chemical reactions. Moreover, achieving higher levels of complexity required the emergence of novelties. How did that happen? Here, we review different case studies associated with the early origins of life in terms of phase transitions and bifurcations, using symmetry breaking and percolation as two central components. We discuss simple models that allow for understanding key steps regarding life origins, such as molecular chirality, the transition to the first replicators and cooperators, the problem of error thresholds and information loss, and the potential for "order for free" as the basis for the emergence of life.

Read the full article at: arxiv.org
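A classic toy model behind the molecular chirality discussion is Frank's autocatalytic scheme, in which each enantiomer catalyses its own production and suppresses the other, so a tiny initial imbalance breaks the symmetry. A minimal sketch (parameter values are illustrative):

```python
import numpy as np
from scipy.integrate import solve_ivp

def frank(t, y, k=1.0, mu=1.0):
    # Each enantiomer autocatalyses (rate k) and the two mutually
    # annihilate (rate mu), amplifying any asymmetry.
    L, D = y
    return [k * L - mu * L * D, k * D - mu * L * D]

# A 0.1% initial excess of the L form...
sol = solve_ivp(frank, (0, 10), [1.001, 1.000], rtol=1e-8)
L, D = sol.y[:, -1]
print(f"enantiomeric excess (L-D)/(L+D): {(L - D) / (L + D):.3f}")
# ...is amplified toward homochirality (excess approaching 1).
```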
|
Scooped by
Complexity Digest
April 19, 10:24 AM
|
Thomas F. Varley, Pedro A. M. Mediano, Alice Patania, Josh Bongard

The study of irreducible higher-order interactions has become a core topic of study in complex systems. Two of the most well-developed frameworks, topological data analysis and multivariate information theory, aim to provide formal tools for identifying higher-order interactions in empirical data. Despite similar aims, however, these two approaches are built on markedly different mathematical foundations and have been developed largely in parallel. In this study, we present a head-to-head comparison of topological data analysis and information-theoretic approaches to describing higher-order interactions in multivariate data, with the aim of assessing the similarities and differences between how the frameworks define "higher-order structures." We begin with toy examples with known topologies, before turning to naturalistic data: fMRI signals collected from the human brain. We find that intrinsic, higher-order synergistic information is associated with three-dimensional cavities in a point cloud: shapes such as spheres are synergy-dominated. In fMRI data, we find strong correlations between synergistic information and both the number and size of three-dimensional cavities. Furthermore, we find that dimensionality-reduction techniques such as PCA preferentially represent higher-order redundancies and largely fail to preserve both higher-order information and topological structure, suggesting that common manifold-based approaches to studying high-dimensional data are systematically failing to identify important features of the data. These results point towards the possibility of developing a rich theory of higher-order interactions that spans topological and information-theoretic approaches, while simultaneously highlighting the profound limitations of more conventional methods.

Read the full article at: arxiv.org
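On the information-theoretic side, the synergy/redundancy balance is commonly measured by the O-information, which under a Gaussian assumption reduces to log-determinants of covariance submatrices. A small sketch (negative values indicate synergy-dominance, positive values redundancy-dominance); this shows the standard estimator, not necessarily the exact pipeline of the paper:

```python
import numpy as np

def gaussian_entropy(cov):
    # Differential entropy of a multivariate Gaussian, in nats.
    cov = np.atleast_2d(cov)
    k = cov.shape[0]
    return 0.5 * (k * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

def o_information(X):
    # Omega = (n-2) H(X) + sum_i [H(X_i) - H(X without i)].
    n = X.shape[1]
    cov = np.cov(X, rowvar=False)
    omega = (n - 2) * gaussian_entropy(cov)
    for i in range(n):
        rest = [j for j in range(n) if j != i]
        omega += gaussian_entropy(cov[i, i]) \
               - gaussian_entropy(cov[np.ix_(rest, rest)])
    return omega

rng = np.random.default_rng(6)
# Synergy example: z is informative only jointly with x and y.
x, y = rng.normal(size=10_000), rng.normal(size=10_000)
z = x + y + 0.1 * rng.normal(size=10_000)
print("synergistic triple:", o_information(np.column_stack([x, y, z])))
# Redundancy example: three noisy copies of one source.
s = rng.normal(size=10_000)
copies = np.column_stack([s + 0.3 * rng.normal(size=10_000)
                          for _ in range(3)])
print("redundant triple:  ", o_information(copies))
```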
|
Suggested by
Takaya Arita
April 18, 4:21 PM
|
Takaya Arita, Wenxian Zheng, Reiji Suzuki, Fuminori Akiba

This study explored how large language models (LLMs) perform in two areas related to art: writing critiques of artworks and reasoning about mental states (Theory of Mind, or ToM) in art-related situations. For the critique-generation part, we built a system that combines Noel Carroll's evaluative framework with a broad selection of art criticism theories. The model was prompted to first write a full-length critique and then shorter, more coherent versions using a step-by-step prompting process. These AI-generated critiques were then compared with those written by human experts in a Turing test-style evaluation. In many cases, human subjects had difficulty telling which was which, and the results suggest that LLMs can produce critiques that are not only plausible in style but also rich in interpretation, as long as they are carefully guided. In the second part, we introduced new simple ToM tasks based on situations involving interpretation, emotion, and moral tension, which can appear in the context of art. These go beyond standard false-belief tests and allow for more complex, socially embedded forms of reasoning. We tested 41 recent LLMs and found that their performance varied across tasks and models. In particular, tasks that involved affective or ambiguous situations tended to reveal clearer differences. Taken together, these results help clarify how LLMs respond to complex interpretative challenges, revealing both their cognitive limitations and potential. While our findings do not directly contradict the so-called Generative AI Paradox (the idea that LLMs can produce expert-like output without genuine understanding), they suggest that, depending on how LLMs are instructed, such as through carefully designed prompts, these models may begin to show behaviors that resemble understanding more closely than we might assume.

Read the full article at: arxiv.org
|
Scooped by
Complexity Digest
April 12, 12:47 PM
|
Aaron Clauset, Barbara F. Walter, Lars-Erik Cederman, Kristian Skrede Gleditsch

Although very large wars remain an enduring threat in global politics, we lack a clear understanding of how some wars become large and costly while most do not. There are three possibilities: large conflicts start with and maintain intense fighting, they persist over a long duration, or they escalate in intensity over time. Using detailed within-conflict data on civil and interstate wars from 1946 to 2008, we show that escalation dynamics (variations in fighting intensity within an armed conflict) play a fundamental role in producing large conflicts and are a generic feature of both civil and interstate wars. However, civil wars tend to de-escalate when they become very large, limiting their overall severity, while interstate wars exhibit a persistent risk of continual escalation. A non-parametric model demonstrates that this distinction in escalation dynamics can explain the differences in the historical sizes of civil vs. interstate wars, and explain Richardson's Law governing the frequency and severity of interstate conflicts over the past 200 years. Escalation dynamics also drive enormous uncertainty in forecasting the eventual sizes of both hypothetical and ongoing civil wars, indicating a need to better understand the causes of escalation and de-escalation within conflicts. The close relationship between the size, and hence the cost, of an armed conflict and its potential for escalation has broad implications for theories of conflict onset or termination and for risk assessment in international relations.

Read the full article at: arxiv.org
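The role of escalation in fattening the tail of war sizes can be caricatured with a multiplicative random walk in intensity; the "civil-like" variant is biased downward once a war grows large, mimicking the de-escalation the authors report. All parameter choices here are illustrative assumptions, not estimates from the paper's data:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_wars(deescalate_when_large, n=10_000):
    # War intensity follows a multiplicative random walk until the war
    # ends (geometric duration); severity is cumulative intensity.
    sizes = []
    for _ in range(n):
        intensity, total = 1.0, 0.0
        while rng.random() > 0.1:                  # 10% chance war ends
            drift = -0.05 if (deescalate_when_large and total > 100) else 0.0
            intensity *= np.exp(rng.normal(drift, 0.5))
            total += intensity
        sizes.append(total)
    return np.array(sizes)

inter = simulate_wars(False)   # persistent escalation risk
civil = simulate_wars(True)    # de-escalates once large
for name, s in [("interstate-like", inter), ("civil-like     ", civil)]:
    print(f"{name}: median={np.median(s):8.1f}, "
          f"99.9th pct={np.quantile(s, 0.999):12.1f}")
# The unchecked-escalation variant produces far heavier extreme sizes.
```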
|
Scooped by
Complexity Digest
April 11, 8:21 AM
|
Xingyu Pan, Zerong Guo
Chaos, Solitons & Fractals, Volume 196, July 2025, 116369

Many real-world systems comprise fundamental elements that exhibit mutual exclusion and alternating activation. Here, we develop a framework for the evolution of network structures that captures the behaviors of such systems. We define the dynamic resilience of temporal networks using variational rates to measure how the evolutionary trajectories of network structures diverge under perturbations. We show that perturbations to specific edges and states of mutually exclusive elements can cause evolutionary trajectories of network structures to deviate significantly from the original path. Furthermore, we demonstrate that traditional resilience factors do not affect dynamic resilience, which is instead governed by mutual exclusion within our framework. Our results advance the study of network resilience, particularly for networks with evolving structures, offering a novel perspective for identifying crucial perturbations within the context of the states of mutually exclusive elements.

Read the full article at: www.sciencedirect.com
|
Suggested by
Fil Menczer
April 7, 2:53 PM
|
Matthew R. DeVerna, Francesco Pierri, Yong-Yeol Ahn, Santo Fortunato, Alessandro Flammini & Filippo Menczer
npj Complexity, volume 2, Article number: 11 (2025)

Understanding how misinformation affects the spread of disease is crucial for public health, especially given recent research indicating that misinformation can increase vaccine hesitancy and discourage vaccine uptake. However, it is difficult to investigate the interaction between misinformation and epidemic outcomes due to the dearth of data-informed holistic epidemic models. Here, we employ an epidemic model that incorporates a large, mobility-informed physical contact network as well as the distribution of misinformed individuals across counties derived from social media data. The model allows us to simulate various scenarios to understand how epidemic spreading can be affected by misinformation spreading through one particular social media platform. Using this model, we compare a worst-case scenario, in which individuals become misinformed after a single exposure to low-credibility content, to a best-case scenario where the population is highly resilient to misinformation. We estimate the additional portion of the U.S. population that would become infected over the course of the COVID-19 epidemic in the worst-case scenario. This work can provide policymakers with insights about the potential harms of exposure to online vaccine misinformation.

Read the full article at: www.nature.com
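A minimal two-group SIR sketch conveys the mechanism being modeled: misinformed individuals transmit (and are exposed) more, raising the attack rate for everyone. The parameters below are illustrative assumptions, not the paper's mobility-informed, county-level model:

```python
import numpy as np

frac_mis = 0.2                  # misinformed share of the population
beta = np.array([0.25, 0.50])   # transmission by [ordinary, misinformed]
susc = np.array([1.0, 2.0])     # relative exposure of each group
gamma = 0.1                     # recovery rate

N = np.array([1 - frac_mis, frac_mis])
S, I, R = N - 1e-4, np.full(2, 1e-4), np.zeros(2)

dt = 0.1
for _ in range(int(300 / dt)):  # forward-Euler integration
    force = (beta * I).sum()    # well-mixed force of infection
    new_inf = susc * S * force * dt
    S, I, R = S - new_inf, I + new_inf - gamma * I * dt, R + gamma * I * dt

print(f"attack rate, ordinary:    {R[0] / N[0]:.1%}")
print(f"attack rate, misinformed: {R[1] / N[1]:.1%}")
print(f"overall infected:         {R.sum():.1%}")
```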
|
Scooped by
Complexity Digest
April 6, 12:47 PM
|
A new suggestion that complexity increases over time, not just in living organisms but in the nonliving world, promises to rewrite notions of time and evolution.

Read the full article at: www.quantamagazine.org
|
Scooped by
Complexity Digest
April 1, 11:47 AM
|
Stephen Polasky, Marten Scheffer, and John M. Anderies
PNAS 122 (14) e2320528122

A well-functioning society requires well-functioning institutions that ensure prosperity, fair distribution of wealth, social participation, security, and informative media. Such institutions are built on a foundation of trust. However, while trust is essential for economic success and good governance, interconnected mechanisms inherent in weakly governed market economies tend to undermine the very trust on which such success depends. These mechanisms include the intrinsic tendency for inequality to grow, for media to boost perceived unfairness, and for self-interest to gain rewards at the expense of others. These mechanisms, if left unchecked, allow wealth concentration to result in state capture, where institutions facilitate further wealth concentration instead of promoting the common good. As a result, people may become alienated and untrusting of fellow citizens and of institutions. Several democracies now experience such dynamics, the United States being a prime example. We discuss ways in which well-functioning democracies can design institutions to help avoid this social trap, and the much harder challenge of escaping the trap once in it. Successful cases, such as the ability of Scandinavian democracies to maintain high trust and the US progressive era in the early 20th century, provide instructive examples.

Read the full article at: www.pnas.org
|