|
Scooped by
Complexity Digest
January 22, 10:42 AM
|
Ori Livson, Siddharth Pritam, Mikhail Prokopenko Preference cycles are prevalent in problems of decision-making, and are contradictory when preferences are assumed to be transitive. This contradiction underlies Condorcet's Paradox, a pioneering result of Social Choice Theory, wherein intuitive and seemingly desirable constraints on decision-making necessarily lead to contradictory preference cycles. Topological methods have since broadened Social Choice Theory and elucidated existing results. However, characterisations of preference cycles in Topological Social Choice Theory are lacking. In this paper, we address this gap by introducing a framework for topologically modelling preference cycles that generalises Baryshnikov's existing topological model of strict, ordinal preferences on 3 alternatives. In our framework, the contradiction underlying Condorcet's Paradox topologically corresponds to the non-orientability of a surface homeomorphic to either the Klein Bottle or Real Projective Plane, depending on how preference cycles are represented. These findings allow us to reduce Arrow's Impossibility Theorem to a statement about the orientability of a surface. Furthermore, these results contribute to existing wide-ranging interest in the relationship between non-orientability, impossibility phenomena in Economics, and logical paradoxes more broadly. Read the full article at: arxiv.org
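To make the underlying cycle concrete, here is the standard three-voter instance of Condorcet's Paradox (a textbook example, independent of the paper's topological framework):

```latex
% Three voters, each with transitive preferences over {a, b, c}:
%   voter 1: a > b > c,   voter 2: b > c > a,   voter 3: c > a > b.
% Pairwise majority votes then yield
\[
\begin{aligned}
a \succ b &\quad \text{(voters 1 and 3)},\\
b \succ c &\quad \text{(voters 1 and 2)},\\
c \succ a &\quad \text{(voters 2 and 3)},
\end{aligned}
\]
% i.e. the cycle a > b > c > a: the majority preference is intransitive
% even though every individual preference is transitive.
```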
|
Scooped by
Complexity Digest
January 22, 6:44 AM
|
María Fernanda Sánchez-Puig, Carlos Gershenson, Carlos Pineda The large digital archives of the American Physical Society (APS) offer an opportunity to quantitatively analyze the structure and evolution of scientific communication. In this paper, we perform a comparative analysis of the language used in eight APS journals (Phys. Rev. A, B, C, D, E, Lett., X, Rev. Mod. Phys.) using methods from statistical linguistics. We study word rank distributions (from monograms to hexagrams), finding that they are consistent with Zipf’s law. We also analyze rank diversity over time, which follows a characteristic sigmoid shape. To quantify the linguistic similarity between journals, we use the rank-biased overlap (RBO) distance, comparing the journals not only to each other, but also to corpora from Google Books and Twitter. This analysis reveals that the most significant differences emerge when focusing on content words rather than the full vocabulary. By identifying the unique and common content words for each specialized journal, we develop an article classifier that predicts a paper’s journal of origin based on its unique word distribution. This classifier uses a proposed “importance factor” to weigh the significance of each word. Finally, we analyze the frequency of mention of prominent physicists and compare it to their cultural recognition as ranked in the Pantheon dataset, finding a low correlation that highlights the context-dependent nature of scientific fame. These results demonstrate that scientific language itself can serve as a quantitative window into the organization and evolution of science. Read the full article at: www.preprints.org
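As a concrete illustration of the rank-biased overlap measure used for the journal comparisons, here is a minimal Python sketch of truncated RBO in Webber et al.'s formulation; the paper's exact variant, persistence parameter, and preprocessing are assumptions here.

```python
def rbo(s, t, p=0.9):
    """Truncated rank-biased overlap between two ranked word lists:
    (1 - p) * sum over depths d of p^(d-1) * |s[:d] & t[:d]| / d.
    Larger p weights deeper ranks more; truncation keeps the result
    slightly below 1 even for identical lists."""
    depth = min(len(s), len(t))
    score = 0.0
    for d in range(1, depth + 1):
        agreement = len(set(s[:d]) & set(t[:d])) / d
        score += (p ** (d - 1)) * agreement
    return (1 - p) * score

# Toy ranked vocabularies for two hypothetical journals:
rank_a = ["the", "of", "quantum", "field", "lattice"]
rank_b = ["the", "of", "spin", "quantum", "field"]
print(f"RBO similarity = {rbo(rank_a, rank_b):.3f}")
print(f"RBO distance   = {1 - rbo(rank_a, rank_b):.3f}")
```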
|
Scooped by
Complexity Digest
January 11, 11:07 AM
|
Andrea Roli, Sudip Patra, Stuart Kauffman Interface Focus (2025) 15 (6): 20250038 We discuss the creation of information in the evolution of the biosphere by elaborating on the interplay between affordances and constraints. We maintain that information is created when affordances are seized and, therefore, at the same time, meaning is generated and a new space of possibilities is created. Read the full article at: royalsocietypublishing.org
|
Scooped by
Complexity Digest
January 9, 3:30 PM
|
David Wolpert, Carlo Rovelli, and Jordan Scharnhorst Entropy 2025, 27(12), 1227 Are your perceptions, memories and observations merely a statistical fluctuation arising from the thermal equilibrium of the universe, bearing no correlation to the actual past state of the universe? Arguments are given in the literature for and against this “Boltzmann brain” hypothesis. Complicating these arguments have been the many subtle—and very often implicit—joint dependencies among these arguments and others that have been given for the past hypothesis, the second law, and even for Bayesian inference of the reliability of experimental data. These dependencies can easily lead to circular reasoning. To avoid this problem, since all of these arguments involve the stochastic properties of the dynamics of the universe’s entropy, we begin by formalizing that dynamics as a time-symmetric, time-translation invariant Markov process, which we call the entropy conjecture. Crucially, like all stochastic processes, the entropy conjecture does not specify any time(s) which it should be conditioned on in order to infer the stochastic dynamics of our universe’s entropy. Any such choice of conditioning times and associated entropy values must be introduced as an independent assumption. This observation allows us to disentangle the standard Boltzmann brain hypothesis, its “1000CE” variant, the past hypothesis, the second law, and the reliability of our experimental data, all in a fully formal manner. In particular, we show that these all make an arbitrary assumption that the dynamics of the universe’s entropy should be conditioned on a single event at a single moment in time, differing only in the details of their assumptions. In this aspect, the Boltzmann brain hypothesis and the second law are equally legitimate (or not). Read the full article at: www.mdpi.com
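For orientation, here is a standard way to state the two properties the entropy conjecture imposes on the entropy process S_t; this is a sketch of the generic definitions, not necessarily the authors' exact formalization:

```latex
% Time-translation invariance: transition probabilities do not depend
% on absolute time,
\[
P(S_{t+\tau} = s' \mid S_t = s) \;=\; P(S_{t'+\tau} = s' \mid S_{t'} = s)
\quad \forall\, t,\, t',\, \tau .
\]
% Time symmetry (with stationary distribution \pi): forward and reversed
% transitions are equally probable,
\[
\pi(s)\, P(S_{t+\tau} = s' \mid S_t = s)
\;=\; \pi(s')\, P(S_{t+\tau} = s \mid S_t = s') .
\]
```

Under these two conditions alone, no moment of time is privileged, which is why any conditioning event must enter as a separate assumption.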
|
Scooped by
Complexity Digest
January 8, 1:13 PM
|
Jasper J. van Beers, Marten Scheffer, Prashant Solanki, Ingrid A. van de Leemput, Egbert H. van Nes, Coen C. de Visser Maintaining stability in feedback systems, from aircraft and autonomous robots to biological and physiological systems, relies on monitoring their behavior and continuously adjusting their inputs. Incremental damage can make such control fragile. This tends to go unnoticed until a small perturbation induces instability (i.e. loss of control). Traditional methods in the field of engineering rely on accurate system models to compute a safe set of operating instructions, which become invalid when the, possibly damaged, system diverges from its model. Here we demonstrate that the approach of such a feedback system towards instability can nonetheless be monitored through dynamical indicators of resilience. This holistic system safety monitor does not rely on a system model and is based on the generic phenomenon of critical slowing down, shown to occur in the climate, biology and other complex nonlinear systems approaching criticality. Our findings for engineered devices open up a wide range of applications involving real-time early warning systems as well as empirical guidance of resilient system design exploration, or "tinkering". While we demonstrate the validity using drones, the generic nature of the underlying principles suggests that these indicators could apply across a wider class of controlled systems including reactors, aircraft, and self-driving cars. Read the full article at: arxiv.org
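The dynamical indicators of resilience in this literature are typically rolling variance and lag-1 autocorrelation, both of which rise under critical slowing down. A minimal, model-free sketch; the paper's exact estimators and windowing are assumptions:

```python
import numpy as np

def resilience_indicators(x, window=200):
    """Rolling variance and lag-1 autocorrelation of a signal; both
    rise as a feedback system loses resilience (critical slowing down)."""
    var, ac1 = [], []
    for i in range(window, len(x)):
        w = x[i - window:i] - x[i - window:i].mean()
        var.append(w.var())
        ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])
    return np.array(var), np.array(ac1)

# Toy system: an AR(1) process whose recovery rate slowly degrades,
# mimicking incremental damage in a controlled feedback loop.
rng = np.random.default_rng(0)
n = 5000
phi = np.linspace(0.5, 0.99, n)          # recovery slows over time
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal(scale=0.1)

var, ac1 = resilience_indicators(x)
print(f"variance: {var[:100].mean():.4f} -> {var[-100:].mean():.4f}")
print(f"lag-1 AC: {ac1[:100].mean():.3f} -> {ac1[-100:].mean():.3f}")
```

Both indicators climb well before the process actually loses stability, which is the basis for model-free early warning.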
|
Scooped by
Complexity Digest
January 8, 4:52 AM
|
Leroy Cronin, Sara I. Walker Assembly theory (AT) introduces a concept of causation as a material property, constitutive of a metrology of evolution and selection. The physical scale for causation is quantified with the assembly index, defined as the minimum number of steps necessary for a distinguishable object to exist, where steps are assembled recursively. Observing countable copies of high assembly index objects indicates that a mechanism to produce them is persistent, such that the object's environment builds a memory that traps causation within a contingent chain. Copy number and assembly index underlie the standardized metrology for detecting causation (assembly index), and evidence of contingency (copy number). Together, these allow the precise definition of a selective threshold in assembly space, understood as the set of all causal possibilities. This threshold demarcates life (and its derivative agential, intelligent and technological forms) as structures with persistent copies beyond the threshold. In introducing a fundamental concept of material causation to explain and measure life, AT represents a departure from prior theories of causation, such as interventional ones, which have so far proven incompatible with fundamental physics. We discuss how AT's concept of causation provides the foundation for a theory of physics where novelty, contingency and the potential for open-endedness are fundamental, and determinism is emergent along assembled lineages. Read the full article at: arxiv.org
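A toy way to make the assembly index concrete is to compute it for strings, where "joining" is concatenation and any previously assembled product can be reused. This brute-force sketch is illustrative only: AT defines the index for physical objects such as molecules, and practical computation uses far smarter search.

```python
from collections import deque

def assembly_index(target):
    """Minimum number of joining steps to build `target` from single
    characters, reusing any previously assembled substring. Exhaustive
    breadth-first search; only feasible for very short strings."""
    pool0 = frozenset(target)            # basic building blocks are free
    if target in pool0:
        return 0
    queue, seen = deque([(pool0, 0)]), {pool0}
    while queue:
        pool, steps = queue.popleft()
        for a in pool:
            for b in pool:
                joined = a + b
                if joined == target:
                    return steps + 1
                # prune: keep only products that are substrings of target
                if len(joined) < len(target) and joined in target:
                    nxt = pool | {joined}
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, steps + 1))

print(assembly_index("abab"))  # 2: reusing "ab" shortens the path
print(assembly_index("abcd"))  # 3: no reuse is possible
```

The reuse of intermediate products is exactly what makes high assembly index objects in high copy number such strong evidence of a persistent generating mechanism.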
|
Scooped by
Complexity Digest
December 28, 2025 10:01 AM
|
Manlio De Domenico The possibility that evolutionary forces -- together with a few fundamental factors such as thermodynamic constraints, specific computational features enabling information processing, and ecological processes -- might constrain the logic of living systems is tantalizing. However, it is often overlooked that any practical implementation of such a logic requires complementary circuitry that, in biological systems, happens through complex networks of genetic regulation, metabolic reactions, cellular signalling, communication, social and eusocial non-trivial organization. Here, we review and discuss how circuitries are not merely passive structures, but active agents of change that, by means of hierarchical and modular organization, are able to enhance and catalyze the evolution of evolvability. By analyzing the role of non-trivial topologies in major evolutionary transitions under the lens of statistical physics and nonlinear dynamics, we show that biological innovations are strictly related to circuitry and its deviation from trivial structures and (thermo)dynamic equilibria. We argue that sparse heterogeneous networks such as hierarchical-modular ones, which are ubiquitously observed in nature, are favored in terms of the trade-off between energetic costs for redundancy, error-correction and maintenance. We identify three main features -- namely, interconnectivity, plasticity and interdependency -- pointing towards a unifying framework for modeling the phenomenology, discussing them in terms of dynamical systems theory, non-equilibrium thermodynamics and evolutionary dynamics. Within this unified picture, we also show that “slow” evolutionary dynamics is an emergent phenomenon governed by the replicator-mutator equation as the direct consequence of a constrained variational nonequilibrium process. Overall, this work highlights how dynamical systems theory and nonequilibrium thermodynamics provide powerful analytical techniques to study biological complexity. Read the full article at: iopscience.iop.org
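For reference, the replicator-mutator equation invoked in the final claim has the standard form below; the paper's constrained variational derivation supplies additional structure not shown here.

```latex
\[
\dot{x}_i \;=\; \sum_{j=1}^{n} x_j\, f_j(\mathbf{x})\, q_{ji}
\;-\; \phi(\mathbf{x})\, x_i,
\qquad
\phi(\mathbf{x}) = \sum_{j=1}^{n} x_j\, f_j(\mathbf{x}),
\]
% x_i: frequency of type i;  f_j: fitness of type j;
% q_{ji}: probability that replication of type j yields type i;
% \phi: mean fitness, which keeps \sum_i x_i = 1.
```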
|
Scooped by
Complexity Digest
December 26, 2025 9:38 AM
|
David H Wolpert Journal of Physics: Complexity, Volume 6, Number 4 The simulation hypothesis has recently excited renewed interest in the physics and philosophy communities. However, the hypothesis specifically concerns computers that simulate physical universes. So to formally investigate the hypothesis, we need to understand it in terms of computer science (CS) theory. In addition we need a formal way to couple CS theory with physics. Here I couple those fields by using the physical Church–Turing thesis. This allows me to exploit Kleene’s second recursion theorem to prove that not only is it possible for us to be a simulation being run on a computer, but that we might be in a simulation that is being run on a computer – by us. In such a ‘self-simulation’, there would be two identical instances of us, both equally ‘real’. I then use Rice’s theorem to derive impossibility results concerning simulation and self-simulation; derive implications for (self-)simulation if we are being simulated in a program using fully homomorphic encryption; and briefly investigate the graphical structure of universes simulating other universes which contain computers running their own simulations. I end by describing some of the possible avenues for future research. While motivated in terms of the simulation hypothesis, the results in this paper are direct consequences of the Church–Turing thesis. So they apply far more broadly than the simulation hypothesis. Read the full article at: iopscience.iop.org
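Kleene's second recursion theorem guarantees that programs can operate on their own description; the classic toy illustration of this self-reference is a quine, a program whose output is exactly its own source code. A minimal Python example (an illustration of the self-reference involved, not the paper's construction):

```python
# A quine: running this two-line program prints its own source code.
s = 's = %r\nprint(s %% s)'
print(s % s)
```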
|
Scooped by
Complexity Digest
December 25, 2025 10:03 AM
|
Sergi Valverde, Blai Vidiella, Salva Duran-Nebreda This chapter investigates the evolutionary ecology of software, focusing on the symbiotic relationship between software and innovation. An interplay between constraints, tinkering, and frequency-dependent selection drives the complex evolutionary trajectories of these socio-technological systems. Our approach integrates agent-based modeling and case studies, drawing on complex network analysis and evolutionary theory to explore how software evolves under the competing forces of novelty generation and imitation. By examining the evolution of programming languages and their impact on developer practices, we illustrate how technological artifacts co-evolve with and shape societal norms, cultural dynamics, and human interactions. This ecological perspective also informs our analysis of the emerging role of AI-driven development tools in software evolution. While large language models (LLMs) provide unprecedented access to information, their widespread adoption introduces new evolutionary pressures that may contribute to cultural stagnation, much like the decline of diversity in past software ecosystems. Understanding the evolutionary pressures introduced by AI-mediated software production is critical for anticipating broader patterns of cultural change, technological adaptation, and the future of software innovation. Read the full article at: arxiv.org
|
Scooped by
Complexity Digest
December 23, 2025 1:43 PM
|
Richard A. Watson, Michael Levin, Tim Lewens Interface Focus (2025) 15 (6): 20250025. It is conventionally assumed that all evolutionary adaptation is produced, and could only possibly be produced, by natural selection. Natural induction is a different mechanism of adaptation. It occurs in dynamical systems described by a network of interactions, where connections give way slightly under stress and the system is subject to occasional perturbations. This differential adjustment of connections causes reorganization of the system’s internal structure in a manner equivalent to associative learning familiar in neural networks. This is sufficient for storage and recall of multiple patterns, learning with generalization and solving difficult constraint problems (without any natural selection involved). Various biological systems (from gene-regulation networks to metabolic networks to ecosystems) meet these basic conditions and therefore have potential to exhibit adaptation by natural induction. Here (and in a follow-on paper), we consider various ways that natural induction and natural selection might interact in biological evolution. For example, in some cases, natural selection may act not as a source of adaptations but as a memory of adaptations discovered by natural induction. We conclude that evolution by natural induction is a viable process that expands our understanding of evolutionary adaptation. Read the full article at: royalsocietypublishing.org
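A minimal sketch of the mechanism described: a Hopfield-style network whose connections "give way" slightly toward whatever state they are held in, with occasional random perturbations; settling under the learned weights then tends to find lower-energy configurations of the original constraint network. All details here (update rule, rates, sizes) are illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30

# A fixed random constraint network: the "problem" is to find states s
# (entries +/-1) with low energy  -0.5 * s @ w0 @ s.
w0 = rng.normal(size=(n, n))
w0 = (w0 + w0.T) / 2
np.fill_diagonal(w0, 0)

def relax(s, w, steps=400):
    # Asynchronous relaxation to a local minimum under weights w.
    for _ in range(steps):
        i = rng.integers(n)
        s[i] = 1 if w[i] @ s > 0 else -1
    return s

def mean_problem_energy(w_dyn, trials=30):
    # Quality of the minima found when dynamics run under w_dyn,
    # always scored against the original problem w0.
    es = []
    for _ in range(trials):
        s = relax(rng.choice([-1, 1], size=n), w_dyn)
        es.append(-0.5 * s @ w0 @ s)
    return float(np.mean(es))

print("before learning:", round(mean_problem_energy(w0), 1))

w = w0.copy()
for _ in range(400):
    s = relax(rng.choice([-1, 1], size=n), w)  # perturb, then settle
    w += 0.01 * np.outer(s, s)                 # stressed links give way
    np.fill_diagonal(w, 0)

print("after learning: ", round(mean_problem_energy(w), 1))
```

The Hebbian-like weight creep widens the basins of the configurations the system keeps settling into, which is the associative-learning behavior the paper identifies with natural induction.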
|
Scooped by
Complexity Digest
December 7, 2025 6:40 AM
|
Alberto Aleta, Andreia Sofia Teixeira, Guilherme Ferraz de Arruda, Andrea Baronchelli, Alain Barrat, János Kertész, Albert Díaz-Guilera, Oriol Artime, Michele Starnini, Giovanni Petri, Márton Karsai, Siddharth Patwardhan, Alessandro Vespignani, Yamir Moreno, Santo Fortunato Multilayer network science has emerged as a central framework for analysing interconnected and interdependent complex systems. Its relevance has grown substantially with the increasing availability of rich, heterogeneous data, which makes it possible to uncover and exploit the inherently multilayered organisation of many real-world networks. In this review, we summarise recent developments in the field. On the theoretical and methodological front, we outline core concepts and survey advances in community detection, dynamical processes, temporal networks, higher-order interactions, and machine-learning-based approaches. On the application side, we discuss progress across diverse domains, including interdependent infrastructures, spreading dynamics, computational social science, economic and financial systems, ecological and climate networks, science-of-science studies, network medicine, and network neuroscience. We conclude with a forward-looking perspective, emphasizing the need for standardized datasets and software, deeper integration of temporal and higher-order structures, and a transition toward genuinely predictive models of complex systems. Read the full article at: arxiv.org
|
Scooped by
Complexity Digest
December 5, 2025 5:24 PM
|
Mohsen Raoufi, Heiko Hamann & Pawel Romanczuk npj Complexity volume 2, Article number: 28 (2025) Collective estimation is a variant of collective decision-making where agents reach consensus on a continuous quantity through social interactions. Achieving precise consensus is complex due to the co-evolution of opinions and the interaction network. While homophilic networks may facilitate estimation in well-connected systems, disproportionate interactions with like-minded neighbors lead to the emergence of echo chambers and prevent consensus. Our agent-based simulations confirm that, besides limited exposure to attitude-challenging opinions, seeking reaffirming information entraps agents in echo chambers. To overcome this, agents can adopt a stubborn state (Messengers) that carries data and connects clusters by physically transporting their opinion. We propose a generic approach based on a Dichotomous Markov Process, which governs probabilistic switching between behavioral states and generates diverse collective behaviors. We study a continuum from task specialization (no switching) to generalization (slow or rapid switching). Messengers help the collective escape local minima, break echo chambers, and promote consensus. Read the full article at: www.nature.com
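The Dichotomous Markov Process underlying the behavioral switching is simply a two-state Markov chain. A discrete-time sketch, with switching rates and the Messenger interpretation as illustrative assumptions:

```python
import numpy as np

def simulate_dmp(k_on, k_off, T=20000, seed=0):
    """Two-state dichotomous Markov process: an agent switches into the
    Messenger state with probability k_on per step, and back with k_off."""
    rng = np.random.default_rng(seed)
    state, traj = 0, np.empty(T, dtype=int)
    for t in range(T):
        if state == 0 and rng.random() < k_on:
            state = 1
        elif state == 1 and rng.random() < k_off:
            state = 0
        traj[t] = state
    return traj

# Same stationary Messenger fraction, k_on / (k_on + k_off) = 0.5, but
# very different behavior: slow switching ~ specialists, fast ~ generalists.
for k in (0.001, 0.1):
    traj = simulate_dmp(k, k)
    print(f"k = {k}: Messenger fraction = {traj.mean():.2f}, "
          f"switches = {int(np.abs(np.diff(traj)).sum())}")
```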
|
Suggested by
PJ Lamberson
December 1, 2025 2:16 PM
|
Gülşah Akçakır, John C. Lang & P. J. Lamberson npj Complexity volume 2, Article number: 35 (2025) Collaboration enables groups to solve problems beyond the reach of their individual members in contexts ranging from research and development to high-energy physics. While communication networks play a pivotal role in group success, there is a longstanding debate on the optimal network topology for solving complex problems. Prior research reaches contradictory conclusions: some studies suggest networks that slow information transmission help maintain diversity, leading groups to explore more of the problem space and find better solutions in the long run, while others argue that networks that maximize communication efficiency allow groups to exploit known solutions, boosting overall performance. Many existing models assume that individuals use their network connections only to copy better-performing group members, but we show that such groups often perform worse than if individuals worked independently. Instead, our model introduces a crucial distinction: in addition to copying, individuals can actively collaborate, leveraging diverse perspectives to uncover solutions that would otherwise remain inaccessible. Our findings reveal that the optimal network structure depends on the balance between copying and collaboration. When copying dominates, inefficient, exploration-focused networks lead to better outcomes. However, when individuals primarily collaborate, highly connected, efficient networks win out. We also show how groups can reap the benefits of both strategies by employing a “collaborate first, copy later” heuristic in highly connected networks. The results offer new insights into how organizations should be structured to maximize problem-solving performance across different contexts. Read the full article at: www.nature.com
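A toy rendering of the two update rules the model distinguishes; the landscape, recombination operator, and schedule below are illustrative assumptions, not the paper's specification:

```python
import random

rng = random.Random(0)
L, _scores = 20, {}

def fitness(bits):
    # A fixed random score per solution: a maximally rugged toy landscape.
    key = tuple(bits)
    if key not in _scores:
        _scores[key] = rng.random()
    return _scores[key]

def copy_step(me, others):
    # Copying: adopt the best solution visible in the neighborhood.
    return list(max(others + [me], key=fitness))

def collaborate_step(me, partner):
    # Collaboration: recombine two solutions and keep the child only if
    # it beats both parents, so complementary partial solutions can reach
    # points neither agent holds alone.
    child = [a if rng.random() < 0.5 else b for a, b in zip(me, partner)]
    return child if fitness(child) > max(fitness(me), fitness(partner)) else list(me)

# "Collaborate first, copy later" in a fully connected group of 10 agents:
pop = [[rng.randint(0, 1) for _ in range(L)] for _ in range(10)]
for t in range(60):
    if t < 30:
        pop = [collaborate_step(s, pop[rng.randrange(len(pop))]) for s in pop]
    else:
        pop = [copy_step(s, [p for p in pop if p is not s]) for s in pop]
print("best fitness found:", round(max(map(fitness, pop)), 3))
```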
|
Scooped by
Complexity Digest
January 22, 8:45 AM
|
Sara Najem, Amer E. Mouawad Determining whether two graphs are structurally identical is a fundamental problem with applications spanning mathematics, computer science, chemistry, and network science. Despite decades of study, graph isomorphism remains a challenging algorithmic task, particularly for highly symmetric structures. Here we introduce a new algorithmic approach based on ideas from spectral graph theory and geometry that constructs candidate correspondences between vertices using their curvatures. Any correspondence produced by the algorithm is explicitly verified, ensuring that non-isomorphic graphs are never incorrectly identified as isomorphic. Although the method does not yet guarantee success on all isomorphic inputs, we find that it correctly resolves every instance tested in deterministic polynomial time, including a broad collection of graphs known to be difficult for classical spectral techniques. These results demonstrate that enriched spectral methods can be far more powerful than previously understood, and suggest a promising direction for the practical resolution of the complexity of the graph isomorphism problem. Read the full article at: arxiv.org
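The abstract does not specify which curvature notion the algorithm uses, so the sketch below substitutes Forman-Ricci edge curvature as a stand-in vertex invariant. What it does share with the paper's design is that every candidate correspondence is explicitly verified, so a True answer is always certified:

```python
def forman(G, u, v):
    # Forman-Ricci curvature of edge (u, v) in an unweighted graph.
    return 4 - len(G[u]) - len(G[v])

def signature(G, u):
    # Vertex invariant: sorted curvatures of the incident edges.
    return tuple(sorted(forman(G, u, v) for v in G[u]))

def is_isomorphic(G, H):
    """Match vertices with equal curvature signatures, then explicitly
    verify any complete correspondence: non-isomorphic graphs are never
    reported isomorphic, though highly symmetric inputs may be slow."""
    sig_g = {u: signature(G, u) for u in G}
    sig_h = {v: signature(H, v) for v in H}
    if sorted(sig_g.values()) != sorted(sig_h.values()):
        return False
    nodes = list(G)
    cand = {u: [v for v in H if sig_h[v] == sig_g[u]] for u in nodes}

    def backtrack(i, mapping, used):
        if i == len(nodes):   # explicit verification of the full map
            return all((mapping[a] in H[mapping[b]]) == (a in G[b])
                       for a in G for b in G)
        u = nodes[i]
        for v in cand[u]:
            if v not in used and backtrack(i + 1, {**mapping, u: v}, used | {v}):
                return True
        return False

    return backtrack(0, {}, set())

C4  = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}          # 4-cycle
C4b = {"a": {"b", "d"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"a", "c"}}
P4  = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}                # 4-path
print(is_isomorphic(C4, C4b), is_isomorphic(C4, P4))        # True False
```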
|
Scooped by
Complexity Digest
January 22, 6:39 AM
|
Galen J. Wilkerson Understanding how network structure constrains and enables information processing is a central problem in the statistical mechanics of interacting systems. Here we study random networks across the structural percolation transition and analyze how connectivity governs realizable input-output transformations under cascade dynamics. Using Erdos-Renyi networks as a minimal ensemble, we examine structural, functional, and information-theoretic observables as functions of mean degree. We find that the emergence of the giant connected component coincides with a sharp transition in realizable information processing: complex input-output response functions become accessible, functional diversity increases rapidly, output entropy rises, and directed information flow, quantified by transfer entropy, extends beyond local neighborhoods. We term this coincidence of structural, functional, and informational transitions functional percolation, referring to a sharp expansion of the space of realizable input-output functions at the percolation threshold. Near criticality, networks exhibit a Pareto-optimal tradeoff between functional complexity and diversity, suggesting that percolation criticality may provide a general organizing principle of information processing capacity in systems with local interactions and propagating influences. Read the full article at: arxiv.org
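The structural side of the transition is easy to reproduce: the giant component of an Erdos-Renyi graph appears at mean degree 1. A minimal sketch of that baseline (the paper's functional and information-theoretic observables are not reproduced here):

```python
import random
from collections import Counter

def giant_fraction(n, mean_degree, rng):
    """Largest-component fraction of G(n, p) with p = <k>/(n-1),
    computed with union-find (path halving)."""
    p = mean_degree / (n - 1)
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)
    sizes = Counter(find(i) for i in range(n))
    return max(sizes.values()) / n

rng = random.Random(0)
for k in (0.5, 1.0, 1.5, 2.0, 4.0):
    print(f"<k> = {k}: giant fraction = {giant_fraction(1000, k, rng):.3f}")
```

The jump in the giant-component fraction around mean degree 1 is the structural threshold at which, per the paper, the space of realizable input-output functions also sharply expands.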
|
Scooped by
Complexity Digest
January 10, 11:05 AM
|
Marlene C. L. Batzke, Peter Steiglechner, Jan Lorenz, Bruce Edmonds, František Kalvas Political Psychology Political polarization represents a rising issue in many countries, making it more and more important to understand its relation to cognitive-motivational and social influence mechanisms. Yet, the link between micro-level mechanisms and macro-level phenomena remains unclear. We investigated the consequences of individuals striving for cognitive coherence in their belief systems on political polarization in society in an agent-based model. In this model, we formalized how cognitive coherence affects how individuals update their beliefs following social influence and self-reflection processes. We derive agents' political beliefs as well as their subjective belief systems, defining what determines coherence for different individuals, from European Social Survey data via correlational class analysis. The simulation shows that agents polarize in their beliefs when they have a strong drive for cognitive coherence, and especially when they have structurally different belief systems. In a mathematical analysis, we not only explain the main findings but also underscore the necessity of simulations for understanding the complex dynamics of socially embedded phenomena such as political polarization. Read the full article at: onlinelibrary.wiley.com
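A sketch of the kind of coherence-gated update rule described, in which an agent accepts socially influenced belief changes with a probability that grows with the coherence gain under its own subjective belief system. The functional forms and parameters here are assumptions for illustration, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5                                        # belief items (e.g. policy attitudes)
C = np.corrcoef(rng.normal(size=(K, 200)))   # agent's subjective belief system

def coherence(b, C):
    # Higher when beliefs align with the belief-system correlations.
    return float(b @ C @ b)

def social_update(b, b_partner, C, step=0.1, drive=3.0):
    """Move toward an interaction partner's beliefs, accepting the move
    with probability increasing in the coherence change; `drive` sets
    how strongly coherence gates social influence."""
    proposal = b + step * (b_partner - b)
    gain = coherence(proposal, C) - coherence(b, C)
    if rng.random() < 1 / (1 + np.exp(-drive * gain)):
        return proposal
    return b

b, partner = rng.normal(size=K), rng.normal(size=K)
print("coherence before:", round(coherence(b, C), 3))
b = social_update(b, partner, C)
print("coherence after: ", round(coherence(b, C), 3))
```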
|
Scooped by
Complexity Digest
January 9, 11:04 AM
|
James N. Druckman, Katherine Ognyanova, Alauna Safarpour, Jonathan Schulman, Kristin Lunz Trujillo, Ata Aydin Uslu, Jon Green, Matthew A. Baum, Alexi Quintana-Mathé, Hong Qu, Roy H. Perlis & David M. J. Lazer Nature Human Behaviour (2025) Scientists provide important information to the public. Whether that information influences decision-making depends on trust. In the USA, gaps in trust in scientists have been stable for 50 years: women, Black people, rural residents, religious people, less educated people and people with lower economic status express less trust than their counterparts (who are more represented among scientists). Here we probe the factors that influence trust. We find that members of the less trusting groups exhibit greater trust in scientists who share their characteristics (for example, women trust women scientists more than men scientists). They view such scientists as having more benevolence and, in most cases, more integrity. In contrast, those from high-trusting groups appear mostly indifferent about scientists’ characteristics. Our results highlight how increasing the presence of underrepresented groups among scientists can increase trust. This means expanding representation across several divides—not just gender and race/ethnicity but also rurality and economic status. Read the full article at: www.nature.com
|
Scooped by
Complexity Digest
January 8, 11:03 AM
|
Aparimit Kasliwal, Abdullah Alhadlaq, Ariel Salgado, Auroop R. Ganguly, Marta C. González Computer-Aided Civil and Infrastructure Engineering Volume 40, Issue 31, 29 December 2025, Pages 6223-6241 Modeling spreading dynamics on spatial networks is crucial to addressing challenges related to traffic congestion, epidemic outbreaks, efficient information dissemination, and technology adoption. Existing approaches include domain-specific agent-based simulations, which offer detailed dynamics but often involve extensive parameterization, and simplified differential equation models, which provide analytical tractability but may abstract away spatial heterogeneity in propagation patterns. As a step toward addressing this trade-off, this work presents a hierarchical multiscale framework that approximates spreading dynamics across different spatial scales under certain simplifying assumptions. Applied to the Susceptible-Infected-Recovered (SIR) model, the approach ensures consistency in dynamics across scales through multiscale regularization, linking parameters at finer scales to those obtained at coarser scales. This approach constrains the parameter search space, and enables faster convergence of the model fitting process compared to the non-regularized model. Using hierarchical modeling, the spatial dependencies critical for understanding system-level behavior are captured while mitigating the computational challenges posed by parameter proliferation at finer scales. Considering traffic congestion and COVID-19 spread as case studies, the calibrated fine-scale model is employed to analyze the effects of perturbations and to identify critical regions and connections that disproportionately influence system dynamics. This facilitates targeted intervention strategies and provides a tool for studying and managing spreading processes in spatially distributed sociotechnical systems. Read the full article at: onlinelibrary.wiley.com
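In schematic form, with region-level SIR compartments; the paper's exact coupling terms and penalty are not given in the abstract, so the regularizer below is one illustrative form:

```latex
% Fine-scale SIR dynamics in region i:
\[
\dot{S}_i = -\beta_i \frac{S_i I_i}{N_i}, \qquad
\dot{I}_i = \beta_i \frac{S_i I_i}{N_i} - \gamma_i I_i, \qquad
\dot{R}_i = \gamma_i I_i .
\]
% Multiscale regularization ties fine-scale parameters to those already
% fitted at the coarser scale (one illustrative quadratic form):
\[
\min_{\{\beta_i, \gamma_i\}} \; \sum_i \mathcal{L}_i(\beta_i, \gamma_i)
\;+\; \lambda \sum_i \Big[ (\beta_i - \beta^{\mathrm{coarse}})^2
+ (\gamma_i - \gamma^{\mathrm{coarse}})^2 \Big] .
\]
```

The penalty shrinks the fine-scale parameter search space toward the coarse-scale fit, which is what yields the reported faster convergence.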
|
Scooped by
Complexity Digest
January 7, 11:01 AM
|
Stefano Caselli, Marta Zava The study examines the structure, functioning, and strategic implications of financial ecosystems across four European countries (France, Sweden, the United Kingdom, and Italy) to identify institutional best practices relevant to the ongoing transformation of Italy's financial system. Building on a comparative analysis of legislation and regulation, taxation, investor bases, and financial intermediation, the report highlights how distinct historical and institutional trajectories have shaped divergent models: the French dirigiste system anchored by powerful state-backed institutions and deep asset management pools; the Swedish social-democratic ecosystem driven by broad household equity participation, tax-efficient savings vehicles, and equity-oriented pension funds; and the British liberal model, characterized by deep capital markets, strong institutional investor engagement, and globally competitive listing infrastructure. In contrast, Italy remains predominantly bank-centric, with fragmented institutional investment, limited retail equity participation, underdeveloped public markets, and a structural reliance on domestic banking channels for corporate finance. Read the full article at: papers.ssrn.com
|
Scooped by
Complexity Digest
December 27, 2025 8:04 AM
|
Federico Battiston, Valerio Capraro, Fariba Karimi, Sune Lehmann, Andrea Bamberg Migliano, Onkar Sadekar, Angel Sánchez & Matjaž Perc Nature Human Behaviour volume 9, pages 2441–2457 (2025) Traditional social network models focus on pairwise interactions, overlooking the complexity of group-level dynamics that shape collective human behaviour. Here we outline how the framework of higher-order social networks—using mathematical representations beyond simple graphs—can more accurately represent interactions involving multiple individuals. Drawing from empirical data including scientific collaborations and contact networks, we demonstrate how higher-order structures reveal mechanisms of group formation, social contagion, cooperation and moral behaviour that are invisible in dyadic models. By moving beyond dyads, this approach offers a transformative lens for understanding the relational architecture of human societies, opening new directions for behavioural experiments, cultural dynamics, team science and group behaviour as well as new cross-disciplinary research. Read the full article at: www.nature.com
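The core representational move is to make groups first-class objects, for example as hyperedges that are sets of nodes rather than pairs. A minimal sketch:

```python
# Hyperedges are node sets of any size: group interactions are
# represented directly instead of being broken into pairs.
hyperedges = [{1, 2, 3}, {2, 4}, {1, 3, 4, 5}]

def groups_of(node):
    return [h for h in hyperedges if node in h]

def peers_of(node):
    # Everyone who shares at least one group with `node`.
    return set().union(*groups_of(node)) - {node}

print(groups_of(2))   # [{1, 2, 3}, {2, 4}]
print(peers_of(2))    # {1, 3, 4}
```

Projecting the hyperedges down to pairwise links would discard exactly the group-level information (who interacted together, not just with whom) that the review argues drives contagion and cooperation dynamics.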
|
Scooped by
Complexity Digest
December 25, 2025 2:05 PM
|
Stuart Bartlett, Michael L. Wong Interface Focus (2025) 15 (6): 20250019. Learning—in addition to thermodynamic dissipation, autocatalysis and homeostasis—has been hypothesized to be a key pillar of all living systems. Here, we examine the myriad ways in which organisms on Earth learn over various time and length scales—from Darwinian evolution to protein computation to the scientific method—in order to draw abstractions about the process of learning in general. Be it in life on Earth or lyfe elsewhere in the universe, we propose that learning can be characterized by a combination of mechanisms that favour functional fitness and those that favour novelty search. We also propose that feedbacks related to learning and dissipation, learning and environmental complexity and learning and self-modelling may be general features that guide how the information-processing and predictive abilities of learning systems evolve with time, perhaps even at the scale of planetary biospheres. Read the full article at: royalsocietypublishing.org
|
Scooped by
Complexity Digest
December 24, 2025 10:05 AM
|
Haoling Zhang, Chao-Han Huck Yang, Hector Zenil, Pin-Yu Chen, Yue Shen, Narsis A. Kiani & Jesper N. Tegnér Nature Communications (2025) As the scale of artificial neural networks continues to expand to tackle increasingly complex tasks or improve the prediction accuracy of specific tasks, the challenges associated with computational demand, hyper-parameter tuning, model interpretability, and deployment costs intensify. Addressing these challenges requires a deeper understanding of how network structures influence network performance. Here, we analyse 882,000 motifs to reveal the functional roles of incoherent and coherent three-node motifs in shaping overall network performance. Our findings reveal that incoherent loops exhibit superior representational capacity and numerical stability, whereas coherent loops show a distinct preference for high-gradient regions within the output landscape. By avoiding such gradient pursuit, incoherent loops sustain more stable adaptation and consequently greater robustness. This mechanism is evident in 97,240 fixed-network training experiments, where coherent-loop networks consistently prioritized high-gradient regions during learning, and is further supported by noise-resilience analyses – from classical reinforcement learning tasks to biological, chemical, and medical applications – which demonstrate that incoherent-loop networks maintain stronger resistance to training noise and environmental perturbations. This work shows the functional impact of structural motif differences on the performance of artificial neural networks, offering foundational insights for designing more resilient and accurate networks. Read the full article at: www.nature.com
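The coherent/incoherent distinction follows the standard sign convention for three-node feed-forward loops: a loop is coherent when the indirect path's net sign matches the direct edge. A minimal classifier under that convention, which may differ in detail from the paper's motif taxonomy:

```python
from itertools import product

def classify_ffl(s_xy, s_yz, s_xz):
    """Feed-forward loop X->Y->Z with direct edge X->Z, edge signs +1
    (activation) or -1 (repression): coherent iff the indirect path
    X->Y->Z carries the same net sign as the direct edge."""
    return "coherent" if s_xy * s_yz == s_xz else "incoherent"

for signs in product((+1, -1), repeat=3):
    print(signs, classify_ffl(*signs))
# Exactly 4 of the 8 sign assignments are coherent, 4 incoherent.
```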
|
Scooped by
Complexity Digest
December 23, 2025 11:44 AM
|
Amahury J. López-Díaz, Pedro Juan Rivera Torres, Gerardo L. Febres, Carlos Gershenson Discrete dynamical models underpin systems biology, but we still lack substrate-agnostic diagnostics for when such models can sustain genuinely open-ended evolution (OEE): the continual production of novel phenotypes rather than eventual settling. We introduce a simple, model-independent metric, Ω, that quantifies OEE as the residence-time-weighted contribution of each attractor's cycle length across the sequence of attractors realized over time. Ω is zero for single-attractor dynamics and grows with the number and persistence of distinct cyclic phenotypes, separating enduring innovation from transient noise. Using Random Boolean Networks (RBNs) as a unifying testbed, we compare classical Boolean dynamics with biologically motivated non-classical mechanisms (probabilistic context switching, annealed rule mutation, paraconsistent logic, modal necessary/possible gating, and quantum-inspired superposition/entanglement) under homogeneous and heterogeneous updating schemes. Our results support the view that undecidability-adjacent, state-dependent mechanisms -- implemented as contextual switching, conditional necessity/possibility, controlled contradictions, or correlated branching -- are enabling conditions for sustained novelty. At the end of our manuscript we outline a practical extension of Ω to continuous/hybrid state spaces, positioning Ω as a portable benchmark for OEE in discrete biological modeling and a guide for engineering evolvable synthetic circuits. Read the full article at: arxiv.org
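The abstract pins down Ω's qualitative behavior (zero for single-attractor dynamics, growing with the number and persistence of distinct cycles) without giving the formula, so the sketch below implements one reading consistent with those properties; the paper's exact definition may differ:

```python
def omega(attractor_seq, cycle_len):
    """Illustrative residence-time-weighted novelty score: each
    attractor's cycle length, weighted by its share of residence time,
    counting only attractors beyond the first one realized. Zero for
    single-attractor dynamics; grows with the number and persistence of
    additional cyclic phenotypes. (One reading consistent with the
    abstract; the paper's exact definition may differ.)"""
    T = len(attractor_seq)
    first = attractor_seq[0]
    residence = {}
    for a in attractor_seq:
        residence[a] = residence.get(a, 0) + 1
    return sum((tau / T) * cycle_len[a]
               for a, tau in residence.items() if a != first)

# Attractor label per time step, plus each attractor's cycle length:
single = ["A"] * 100
churn  = ["A"] * 30 + ["B"] * 40 + ["C"] * 30
lengths = {"A": 1, "B": 4, "C": 6}
print(omega(single, lengths))  # 0.0
print(omega(churn, lengths))   # 0.4*4 + 0.3*6 = 3.4
```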
|
Scooped by
Complexity Digest
December 6, 2025 5:27 PM
|
Guest Editors: Thiago B. Murari, Marcelo A. Moret, Hernane B. de B. Pereira, Tarcísio M. Rocha Filho, José F. F. Mendes, Tiziana Di Matteo Inspired by the Conference on Complex Systems 2023 (CCS2023) in Salvador, Brazil, this collection in EPJ B brings together 25 peer-reviewed articles covering a wide range of topics. This collection highlights the interdisciplinary nature of the field, with contributions from physics, biology, economics, linguistics, and artificial intelligence, and serves as a reference for researchers addressing real-world challenges through systems-based thinking. Read the full issue at: epjb.epj.org
|
Scooped by
Complexity Digest
December 4, 2025 7:17 PM
|
Michael Lissack This paper presents a unified theoretical framework that reconciles four apparently disparate approaches: Quantum Bayesianism (QBism), Robert Rosen's theory of Anticipatory Systems, the causal bubbles interpretation of quantum mechanics, and pragmatic constructivism through Hans Vaihinger's philosophy of 'as if.' We demonstrate that these frameworks converge on a fundamental insight: reality emerges from a relational causal structure (the pattern of influences that determines what can affect what) rather than from external observation. The QBist agent exemplifies a Rosen Anticipatory System operating within a causal bubble, wherein the quantum wave function serves as a heuristic fiction, an 'as if' construct, used for anticipatory modeling within the agent's architecture rather than for ontological description. This synthesis resolves longstanding quantum paradoxes, provides a naturalized account of final causality, and extends to encompass human cognition and artificial intelligence as distinct instantiations of the same anticipatory pattern. We argue that physical laws function as normative standards for coherent anticipation that acquire constraining force through selective pressure, and that this relational ontology bridges quantum physics, theoretical biology, epistemology, and cognitive science, dissolving apparent conflicts between these domains into perspectives on a shared structure. Read the full article at: papers.ssrn.com
|