Papers
Recent publications related to complex systems
Scooped by Complexity Digest
July 8, 4:00 AM

COSMOS MIND AND MATTER: Is Mind in Spacetime?

Stuart Kauffman, Sudip Patra

BioSystems

We attempt in this article to formulate a conceptual and testable framework weaving Cosmos, Mind and Matter into a whole. We build on three recent discoveries, each requiring more evidence: i. The particles of the Standard Model, SU(3) x SU(2) x U(1), are formally capable of collective autocatalysis. This leads us to ask what roles such autocatalysis may have played in Cosmogenesis, and in trying to answer, Why our Laws? Why our Constants? A capacity of the particles of SU(3) x SU(2) x U(1) for collective autocatalysis may be open to experimental test, which would be stunning if confirmed. ii. Reasonable evidence now suggests that matter can expand spacetime. The first issue is to establish this claim at or beyond 5 sigma if that can be done. If true, this process may elucidate Dark Matter, Dark Energy and Inflation and require alteration of Einstein’s Field Equations. Cosmology would be transformed. iii. Evidence at 6.49 sigma suggests that mind can alter the outcome of the two-slit experiment. If widely and independently verified, the foundations of quantum mechanics must be altered. Mind plays a role in the universe. That role may include Cosmic Mind.

Read the full article at: www.sciencedirect.com

Scooped by Complexity Digest
July 6, 2:42 PM

Conscious artificial intelligence and biological naturalism

Anil Seth

As artificial intelligence (AI) continues to develop, it is natural to ask whether AI systems can be not only intelligent, but also conscious. I consider why some people think AI might develop consciousness, identifying some biases that lead us astray. I ask what it would take for conscious AI to be a realistic prospect, pushing back against some common assumptions such as the notion that computation provides a sufficient basis for consciousness. I’ll instead make the case for taking seriously the possibility that consciousness might depend on our nature as living organisms – a form of biological naturalism. I will end by exploring some wider issues including testing for consciousness in AI, and ethical considerations arising from AI that either actually is, or convincingly seems to be, conscious.

Read the full article at: osf.io

Scooped by Complexity Digest
July 4, 6:51 PM

Evolutionary Implications of Self-Assembling Cybernetic Materials with Collective Problem-Solving Intelligence at Multiple Scales

Hartl, B.; Risi, S.; Levin, M.

Entropy 2024, 26, 532

In recent years, the scientific community has increasingly recognized the complex multi-scale competency architecture (MCA) of biology, comprising nested layers of active homeostatic agents, each forming the self-orchestrated substrate for the layer above, and, in turn, relying on the structural and functional plasticity of the layer(s) below. The question of how natural selection could give rise to this MCA has been the focus of intense research. Here, we instead investigate the effects of such decision-making competencies of MCA agential components on the process of evolution itself, using in silico neuroevolution experiments of simulated, minimal developmental biology. We specifically model the process of morphogenesis with neural cellular automata (NCAs) and utilize an evolutionary algorithm to optimize the corresponding model parameters with the objective of collectively self-assembling a two-dimensional spatial target pattern (reliable morphogenesis). Furthermore, we systematically vary the accuracy with which the uni-cellular agents of an NCA can regulate their cell states (simulating stochastic processes and noise during development). This allows us to continuously scale the agents’ competency levels from a direct encoding scheme (no competency) to an MCA (with perfect reliability in cell decision executions). We demonstrate that an evolutionary process proceeds much more rapidly when evolving the functional parameters of an MCA compared to evolving the target pattern directly. Moreover, the evolved MCAs generalize well toward system parameter changes and even modified objective functions of the evolutionary process. Thus, the adaptive problem-solving competencies of the agential parts in our NCA-based in silico morphogenesis model strongly affect the evolutionary process, suggesting significant functional implications of the near-ubiquitous competency seen in living matter.
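
A loose, hypothetical sketch of this kind of experiment (not the authors' model or code, which is richer): a one-unit neural cellular automaton is evolved with a (1+1) evolution strategy to self-assemble a toy target pattern, with a noise level sigma on the state updates standing in, very roughly, for reduced competency of the cellular agents. Grid size, update rule, and parameter values are all illustrative assumptions; numpy is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 8
target = np.zeros((H, W)); target[2:6, 2:6] = 1.0   # toy square "morphology"

def step(state, weights, sigma):
    """One synchronous NCA update: each cell passes its 3x3 neighborhood
    through a single linear+sigmoid unit. Gaussian noise on the update
    models imperfect reliability of the cells' decision execution."""
    padded = np.pad(state, 1, mode="wrap")
    neigh = np.stack([padded[i:i + H, j:j + W]
                      for i in range(3) for j in range(3)], axis=-1)
    update = 1.0 / (1.0 + np.exp(-(neigh @ weights[:9] + weights[9])))
    return np.clip(update + sigma * rng.normal(size=state.shape), 0.0, 1.0)

def fitness(weights, sigma, steps=20):
    state = np.zeros((H, W)); state[H // 2, W // 2] = 1.0  # single seed cell
    for _ in range(steps):
        state = step(state, weights, sigma)
    return -np.mean((state - target) ** 2)               # negative MSE

def evolve(sigma, generations=300):
    """(1+1) evolution strategy over the 10 NCA parameters."""
    best = rng.normal(size=10); best_f = fitness(best, sigma)
    for _ in range(generations):
        child = best + 0.1 * rng.normal(size=10)
        f = fitness(child, sigma)
        if f > best_f:
            best, best_f = child, f
    return best_f

for sigma in (0.0, 0.1, 0.3):   # scan competency levels (0.0 = fully reliable)
    print(f"noise={sigma:.1f}  best fitness={evolve(sigma):.4f}")
```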

Read the full article at: www.mdpi.com

Scooped by Complexity Digest
July 3, 4:49 PM

Minimalist exploration strategies for robot swarms at the edge of chaos

Vinicius Sartorio, Luigi Feola, Emanuel Estrada, Vito Trianni, Jonata Tyska Carvalho

Effective exploration abilities are fundamental for robot swarms, especially when small, inexpensive robots are employed (e.g., micro- or nano-robots). Random walks are often the only viable choice if robots are too constrained regarding sensors and computation to implement state-of-the-art solutions. However, identifying the best random walk parameterisation may not be trivial. Additionally, variability among robots in terms of motion abilities (a very common condition when precise calibration is not possible) introduces the need for flexible solutions. This study explores how random walks that present chaotic or edge-of-chaos dynamics can be generated. We also evaluate their effectiveness for a simple exploration task performed by a swarm of simulated Kilobots. First, we show how Random Boolean Networks can be used as controllers for the Kilobots, achieving a significant performance improvement compared to the best parameterisation of a Lévy-modulated Correlated Random Walk. Second, we demonstrate how chaotic dynamics are beneficial to maximise exploration effectiveness. Finally, we demonstrate how the exploration behavior produced by Boolean Networks can be optimized through an Evolutionary Robotics approach while maintaining the chaotic dynamics of the networks.
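
The core ingredient is easy to demo. Below is a hedged sketch (assuming numpy; the node count, update scheme, and motor read-out are illustrative, not the paper's controller) of a classical random Boolean network whose in-degree K places it in an ordered (K=1), near-critical "edge of chaos" (K=2), or chaotic (K=3) regime, probed Derrida-style by how a one-bit perturbation spreads:

```python
import numpy as np

rng = np.random.default_rng(1)

def make_rbn(n=20, k=2):
    inputs = rng.integers(0, n, size=(n, k))       # K random inputs per node
    tables = rng.integers(0, 2, size=(n, 2 ** k))  # random Boolean functions
    return inputs, tables

def step(state, inputs, tables):
    idx = np.zeros(len(state), dtype=int)
    for j in range(inputs.shape[1]):               # pack inputs into a table index
        idx = (idx << 1) | state[inputs[:, j]]
    return tables[np.arange(len(state)), idx]      # synchronous update

def divergence(k, n=20, steps=50, trials=200):
    """Average Hamming distance after perturbing one bit: ~0 in the
    ordered regime, large in the chaotic regime; K=2 sits near criticality."""
    total = 0.0
    for _ in range(trials):
        inputs, tables = make_rbn(n, k)
        a = rng.integers(0, 2, size=n)
        b = a.copy(); b[rng.integers(n)] ^= 1      # flip a single bit
        for _ in range(steps):
            a, b = step(a, inputs, tables), step(b, inputs, tables)
        total += np.mean(a != b)
    return total / trials

for k in (1, 2, 3):
    print(f"K={k}  avg final Hamming distance = {divergence(k):.3f}")
# a robot controller would read out designated nodes, e.g.
# left, right = state[0], state[1]  (illustrative mapping only)
```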

Read the full article at: arxiv.org

Scooped by Complexity Digest
July 1, 8:34 AM

Impact of navigation apps on congestion and spread dynamics on a transportation network

Alben Rome Bagabaldo, Qianxin Gan, Alexandre M. Bayen & Marta C. González

Data Science for Transportation, Volume 6, Article number 12 (2024)

In recent years, the widespread adoption of navigation apps by motorists has raised questions about their impact on local traffic patterns. Users increasingly rely on these apps to find better, real-time routes to minimize travel time. This study uses microscopic traffic simulations to examine the connection between navigation app use and traffic congestion. The research incorporates both static and dynamic routing to model user behavior. Dynamic routing represents motorists who actively adjust their routes based on app guidance during trips, while static routing models users who stick to known fastest paths. Key traffic metrics, including flow, density, speed, travel time, delay time, and queue lengths, are assessed to evaluate the outcomes. Additionally, we explore congestion propagation at various levels of navigation app adoption. To understand congestion dynamics, we apply a susceptible–infected–recovered (SIR) model, commonly used in disease spread studies. Our findings reveal that traffic system performance improves when 30–60% of users follow dynamic routing. The SIR model supports these findings, highlighting the most efficient congestion propagation-to-dissipation ratio when 40% of users adopt dynamic routing, as indicated by the lowest basic reproductive number. This research provides valuable insights into the intricate relationship between navigation apps and traffic congestion, with implications for transportation planning and management.
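
To make the SIR analogy concrete, here is a toy sketch: road links are "susceptible", congested links are "infected", and links whose queues have dissipated are "recovered", so the basic reproductive number R0 = beta/gamma captures the propagation-to-dissipation ratio the authors examine. The beta values attached to adoption levels below are invented for illustration, not the paper's calibrated results.

```python
def sir_peak(beta, gamma, i0=0.01, dt=0.1, T=200.0):
    """Euler-integrate dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I,
    dR/dt = gamma*I and return the peak congested ('infected') fraction."""
    s, i = 1.0 - i0, i0
    peak = i
    for _ in range(int(T / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s, i = s + dt * ds, i + dt * di
        peak = max(peak, i)
    return peak

gamma = 0.2   # queue-dissipation rate (illustrative)
# hypothetical mapping: dynamic-routing adoption changes the congestion
# spreading rate beta non-monotonically, echoing the paper's R0 analysis
for adoption, beta in [(0.0, 0.50), (0.4, 0.25), (0.8, 0.35)]:
    print(f"adoption={adoption:.0%}  R0={beta / gamma:.2f}  "
          f"peak congested fraction={sir_peak(beta, gamma):.3f}")
```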

Read the full article at: link.springer.com

Scooped by Complexity Digest
June 29, 12:04 PM

Laplacian Renormalization Group: An introduction to heterogeneous coarse-graining

Guido Caldarelli, Andrea Gabrielli, Tommaso Gili, Pablo Villegas

The renormalization group (RG) constitutes a fundamental framework in modern theoretical physics. It allows the study of many systems showing states with large-scale correlations and their classification into a relatively small set of universality classes. RG is the most powerful tool for investigating organizational scales within dynamic systems. However, the application of RG techniques to complex networks has presented significant challenges, primarily due to the intricate interplay of correlations on multiple scales. Existing approaches have relied on hypotheses involving hidden geometries, based on embedding complex networks into hidden metric spaces. Here, we present a practical overview of the recently introduced Laplacian Renormalization Group (LRG) for heterogeneous networks. First, we present a brief overview that justifies the use of the Laplacian as a natural extension of well-known field theories to analyze spatial disorder. We then draw an analogy to traditional real-space renormalization group procedures, explaining how the LRG generalizes the concept of "Kadanoff supernodes" as block nodes that span multiple scales. These supernodes help mitigate the effects of cross-scale correlations due to small-world properties. Additionally, we rigorously define the LRG procedure in momentum space in the spirit of the Wilson RG. Finally, we show different analyses of the evolution of network properties along the LRG flow, following structural changes as the network is progressively reduced.
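
For orientation, a minimal sketch of the LRG's basic objects on a toy random graph, assuming numpy and scipy: the Laplacian L, the diffusion density matrix rho(tau) = exp(-tau*L)/Tr exp(-tau*L), and its von Neumann entropy, whose decay across the diffusion time tau signals the scales at which the network can be coarse-grained into supernodes. The full LRG (supernode construction, momentum-space formulation) goes well beyond this single diagnostic.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
n = 40
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.triu(A, 1); A = A + A.T               # toy undirected random graph
L = np.diag(A.sum(axis=1)) - A               # combinatorial Laplacian

def entropy(tau):
    """Von Neumann entropy of rho(tau) = exp(-tau*L) / Tr exp(-tau*L)."""
    rho = expm(-tau * L)
    rho /= np.trace(rho)
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log(lam)))

# S decays from ~log(n) toward log(#connected components) as diffusion
# progressively blurs ever larger structural scales
for tau in (0.1, 1.0, 10.0, 100.0):
    print(f"tau={tau:6.1f}  S={entropy(tau):.3f}   (log n = {np.log(n):.3f})")
```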

Read the full article at: arxiv.org

Scooped by Complexity Digest
June 28, 8:10 PM

Assembly Theory and its Relationship with Computational Complexity

Christopher Kempes, Sara I. Walker, Michael Lachmann, Leroy Cronin

Assembly theory (AT) quantifies selection using the assembly equation and identifies complex objects that occur in abundance based on two measurements, assembly index and copy number. The assembly index is determined by the minimal number of recursive joining operations necessary to construct an object from basic parts, and the copy number is how many of the given object(s) are observed. Together these allow defining a quantity, called Assembly, which captures the amount of causation required to produce the observed objects in the sample. AT's focus on how selection generates complexity offers an approach distinct from that of computational complexity theory, which focuses on minimum descriptions via compressibility. To explore formal differences between the two approaches, we show several simple and explicit mathematical examples demonstrating that the assembly index, itself only one piece of the theoretical framework of AT, is formally not equivalent to other commonly used complexity measures from computer science and information theory, including Huffman encoding and Lempel-Ziv-Welch compression.
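
The formal gap is easy to see on strings. The sketch below computes an exact assembly index for short strings by brute-force iterative deepening, where one join concatenates two already-assembled pieces and pieces are freely reusable, and compares it with zlib output size as a crude stand-in for compression-based measures; the search strategy and pruning are illustrative choices, not AT's published algorithms.

```python
import zlib

def assembly_index(target: str, max_depth: int = 12) -> int:
    """Minimal number of joins needed to build `target` from its unit
    symbols, where each join concatenates two strings already in the
    pool and the pool is shared (sub-objects are reusable)."""
    basics = frozenset(target)

    def dfs(pool, depth):
        if target in pool:
            return True
        if depth == 0:
            return False
        pieces = sorted(pool)
        for a in pieces:
            for b in pieces:
                joined = a + b
                # any useful intermediate must appear inside the target
                if joined in target and joined not in pool:
                    if dfs(pool | {joined}, depth - 1):
                        return True
        return False

    for d in range(max_depth + 1):   # iterative deepening: smallest d wins
        if dfs(basics, d):
            return d
    raise ValueError("max_depth too small")

for s in ["ababab", "abcdef"]:
    print(f"{s!r}: assembly index = {assembly_index(s)}, "
          f"zlib bytes = {len(zlib.compress(s.encode()))}")
# 'ababab' assembles in 3 joins (ab -> abab -> ababab) while 'abcdef'
# needs 5, even though their compressed sizes are nearly identical.
```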

Read the full article at: arxiv.org

Scooped by Complexity Digest
June 28, 1:21 PM

Heinz von Foerster's operational epistemology: orientation for insight into complexity

Arantzazu Saratxaga Arregi
Kybernetes

Purpose

Based on the reception of the principle of self-organization, the core of Heinz von Foerster's (HvF's) operational theories, I hypothesize how his theory can serve as an orientation model for the epistemological problem of complexity. I have chosen this study to demonstrate complexity as an epistemological problem, because the question of how order arises, the core problem of complexity, is an epistemological question for which von Foerster developed an epistemology of self-organization. I do not present new research, because HvF already had the complex organization of systems in mind. Rather, I build a critical approach to complexity on HvF's research and work on operational epistemology.

Design/methodology/approach

This article aims to provide an orientation for a philosophical and epistemological understanding of complexity through a reading of Heinz von Foerster's operational theory. The article attempts to establish complexity as an epistemological phenomenon through the following method: (1) a conceptual description of the science of complexity based on the turn to thermodynamic time, (2) a genealogy of complexity going back to the systemic method, and (3) Heinz von Foerster's cybernetic approach to self-organization.

Findings

Based on the reception of the principle of self-organization, the core of Heinz von Foerster's operational theories, the conclusion is drawn that complexity as a description is based on language games.

Research limitations/implications

The results present complexity not as an object of science, but as a description that stands for the understanding of complex description.

Social implications

The hypothesis that complexity is a question of description or observation, i.e. of the kind of description that language serves, has enormous social implications, in that the description of complex systems and the recognition of their orders (patterns) cannot be left to algorithmic governmentality, but must be carried out by a social agency.

Originality/value

HvF's operational epistemology can serve as an epistemological model for critical complexity theory.


Read the full article at: www.emerald.com

Scooped by Complexity Digest
June 16, 2:52 PM

Fundamental Constraints to the Logic of Living Systems

Solé, R.; Kempes, C. P.; Corominas-Murtra, B.; De Domenico, M.; Kolchinsky, A.; Lachmann, M.; Libby, E.; Saavedra, S.; Smith, E.; Wolpert, D.

Preprints 2024, 2024060891

It has been argued that the historical nature of evolution makes it a highly path-dependent process. Under this view, the outcome of evolutionary dynamics could have resulted in organisms with different forms and functions. At the same time, there is ample evidence that convergence and constraints strongly limit the domain of the potential design principles that evolution can achieve. Are these limitations relevant in shaping the fabric of the possible? Here, we argue that fundamental constraints are associated with the logic of living matter. We illustrate this idea by considering the thermodynamic properties of living systems, the linear nature of molecular information, the cellular nature of the building blocks of life, multicellularity and development, the threshold nature of computations in cognitive systems, and the discrete nature of the architecture of ecosystems. In all these examples, we present available evidence and suggest potential avenues towards a well-defined theoretical formulation.

Read the full article at: www.preprints.org

Scooped by Complexity Digest
June 15, 8:05 AM

Higher-order correlations reveal complex memory in temporal hypergraphs


Luca Gallo, Lucas Lacasa, Vito Latora & Federico Battiston 
Nature Communications volume 15, Article number: 4754 (2024)

Many real-world complex systems are characterized by interactions in groups that change in time. Current temporal network approaches, however, are unable to describe group dynamics, as they are based on pairwise interactions only. Here, we use time-varying hypergraphs to describe such systems, and we introduce a framework based on higher-order correlations to characterize their temporal organization. The analysis of human interaction data reveals the existence of coherent and interdependent mesoscopic structures, thus capturing aggregation, fragmentation and nucleation processes in social systems. We introduce a model of temporal hypergraphs with non-Markovian group interactions, which reveals complex memory as a fundamental mechanism underlying the emerging pattern in the data.
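
A heavily simplified sketch of the kind of object and statistic involved, assuming numpy (the paper's higher-order correlation framework is substantially richer): store a temporal hypergraph as per-snapshot lists of groups, then compute lagged correlations between the numbers of active groups of each size.

```python
import numpy as np

rng = np.random.default_rng(3)
T, N = 500, 30
# synthetic temporal hypergraph: each snapshot holds a few random groups
snapshots = [[rng.choice(N, size=rng.integers(2, 6), replace=False)
              for _ in range(rng.integers(1, 5))] for _ in range(T)]

sizes = range(2, 6)
counts = np.array([[sum(len(e) == k for e in snap) for k in sizes]
                   for snap in snapshots])        # shape (T, number of orders)

# cross-correlation between group-size activity at t and at t+1
lagged = np.corrcoef(counts[:-1].T, counts[1:].T)[:len(sizes), len(sizes):]
for i, k in enumerate(sizes):
    row = "  ".join(f"{lagged[i, j]:+.2f}" for j in range(len(sizes)))
    print(f"corr(n_{k}(t), n_k'(t+1)) for k'=2..5:  {row}")
```

For this synthetic, memoryless data the correlations hover near zero; it is in real interaction data (and in the paper's non-Markovian model) that coherent structure appears.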

Read the full article at: www.nature.com

Scooped by Complexity Digest
June 12, 1:12 PM

Evidence Mounts That About 7% of US Adults Have Had Long COVID


Zhengyi Fang; Rebecca Ahrnsbrak; Andy Rekito

JAMA Data Brief

New data from the Medical Expenditure Panel Survey (MEPS) Household Component support prior findings that about 7% of US adults have had post–COVID-19 condition, also known as long COVID. The household survey of the US civilian noninstitutionalized population, sponsored by the Agency for Healthcare Research and Quality, found that an estimated 6.9% of adults—17.8 million—had ever had long COVID as of early 2023.

This nationally representative survey included a sample of 17,418 adults aged 18 years or older, corresponding to 259 million adults. A total of 8,275 adults reported having had COVID-19, of whom 1,202 reported having had long COVID symptoms.

Read the full article at: jamanetwork.com

Suggested by Fil Menczer
June 10, 8:59 AM

Anatomy of an AI-powered malicious social botnet


Yang, K., & Menczer, F. (2024).

Journal of Quantitative Description: Digital Media 4

Large language models (LLMs) exhibit impressive capabilities in generating realistic text across diverse subjects. Concerns have been raised that they could be utilized to produce fake content with a deceptive intention, although evidence thus far remains anecdotal. This paper presents a case study about a Twitter botnet that appears to employ ChatGPT to generate human-like content. Through heuristics, we identify 1,140 accounts and validate them via manual annotation. These accounts form a dense cluster of fake personas that exhibit similar behaviors, including posting machine-generated content and stolen images, and engage with each other through replies and retweets. ChatGPT-generated content promotes suspicious websites and spreads harmful comments. While the accounts in the AI botnet can be detected through their coordination patterns, current state-of-the-art LLM content classifiers fail to discriminate between them and human accounts in the wild. These findings highlight the threats posed by AI-enabled social bots.

Read the full article at: journalqd.org

Scooped by Complexity Digest
May 31, 10:21 AM

Self-Improvising Memory: A Perspective on Memories as Agential, Dynamically Reinterpreting Cognitive Glue


Michael Levin

Entropy 2024, 26(6), 481

Many studies on memory emphasize the material substrate and mechanisms by which data can be stored and reliably read out. Here, I focus on complementary aspects: the need for agents to dynamically reinterpret and modify memories to suit their ever-changing selves and environment. Using examples from developmental biology, evolution, and synthetic bioengineering, in addition to neuroscience, I propose that a perspective on memory as preserving salience, not fidelity, is applicable to many phenomena on scales from cells to societies. Continuous commitment to creative, adaptive confabulation, from the molecular to the behavioral levels, is the answer to the persistence paradox as it applies to individuals and whole lineages. I also speculate that a substrate-independent, processual view of life and mind suggests that memories, as patterns in the excitable medium of cognitive systems, could be seen as active agents in the sense-making process. I explore a view of life as a diverse set of embodied perspectives—nested agents who interpret each other’s and their own past messages and actions as best as they can (polycomputation). This synthesis suggests unifying symmetries across scales and disciplines, which is of relevance to research programs in Diverse Intelligence and the engineering of novel embodied minds.

Read the full article at: www.mdpi.com

Scooped by Complexity Digest
July 7, 2:46 PM

The development of ecological systems along paths of least resistance

Jie Deng, Otto X. Cordero, Tadashi Fukami, Simon A. Levin, Robert M. Pringle, Ricard Solé, Serguei Saavedra

A long-standing question in biology is whether there are common principles that characterize the development of ecological systems (the appearance of a group of taxa), regardless of organismal diversity and environmental context. Classic ecological theory holds that these systems develop following a sequenced, orderly process that generally proceeds from fast-growing to slow-growing taxa and depends on life-history trade-offs. However, it is also possible that this developmental order is simply the path with the least environmental resistance for survival of the component species and hence favored by probability alone. Here, we use theory and data to show that the order from fast- to slow-growing taxa is the most likely developmental path for diverse systems when local taxon interactions self-organize to minimize environmental resistance. First, we demonstrate theoretically that a sequenced development is more likely than a simultaneous one, at least until the number of iterations becomes so large as to be ecologically implausible. We then show that greater diversity of taxa and life histories improves the likelihood of a sequenced order from fast- to slow-growing taxa. Using data from bacterial and metazoan systems, we present empirical evidence that the developmental order of ecological systems moves along the paths of least environmental resistance. The capacity of simple principles to explain the trend in the developmental order of diverse ecological systems paves the way to an enhanced understanding of the collective features characterizing the diversity of life.

Read the full article at: www.biorxiv.org

Scooped by Complexity Digest
July 5, 2:50 PM

Infection patterns in simple and complex contagion processes on networks

Contreras DA, Cencetti G, Barrat A

PLoS Comput Biol 20(6): e1012206.

Contagion processes, representing the spread of infectious diseases, information, or social behaviors, are often schematized as taking place on networks, which encode for instance the interactions between individuals. We here observe how the network is explored by the contagion process, i.e. which links are used for contagions and how frequently. The resulting infection pattern depends on the chosen infection model but, surprisingly, not all parameters and model features play a role in it. We discover, for instance, that in simple contagion processes, where contagion events involve one connection at a time, the infection patterns are extremely robust across models and parameters. This has consequences for the role of models in decision-making, as it implies that numerical simulations of simple contagion processes using simplified settings can bring important insights even in the case of a new emerging disease whose properties are not yet well known. In complex contagion models, instead, in which multiple interactions are needed for a contagion event, non-trivial dependencies on model parameters emerge, and infection patterns can no longer be confused with those observed for simple contagion.
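
A minimal sketch of the two mechanisms on a toy random graph, with invented parameters and assuming numpy: simple contagion transmits over single links independently, complex contagion requires a threshold number of infected neighbours, and the links credited below give a simplified proxy for the infection pattern studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 200
A = (rng.random((N, N)) < 0.05).astype(int)
A = np.triu(A, 1); A = A + A.T                       # mean degree ~10

def spread(simple=True, beta=0.3, theta=2, steps=30, seeds=20):
    infected = np.zeros(N, dtype=bool); infected[:seeds] = True
    used = np.zeros((N, N), dtype=bool)              # links used for contagion
    for _ in range(steps):
        exposure = A @ infected                      # infected neighbours per node
        if simple:
            p = 1 - (1 - beta) ** exposure           # independent per-link trials
            new = (~infected) & (rng.random(N) < p)
        else:
            new = (~infected) & (exposure >= theta)  # threshold (complex) rule
        for v in np.where(new)[0]:
            # links from infected neighbours to each new case (simplified crediting)
            used[v] |= (A[v] == 1) & infected
        infected |= new
        if not new.any():
            break
    return infected.sum(), used.sum()

for label, kwargs in [("simple ", {"simple": True}), ("complex", {"simple": False})]:
    size, links = spread(**kwargs)
    print(f"{label} contagion: final size = {size}, links in pattern = {links}")
```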

Read the full article at: journals.plos.org

Scooped by Complexity Digest
July 4, 2:48 PM

An Invitation to Universality in Physics, Computer Science, and Beyond

Tomáš Gonda, Gemma De les Coves

A universal Turing machine is a powerful concept: a single device can compute any function that is computable. A universal spin model, similarly, is a class of physical systems whose low energy behavior simulates that of any spin system. Our categorical framework for universality (arXiv:2307.06851) captures these and other examples of universality as instances. In this article, we present an accessible account thereof with a focus on its basic ingredients and ways to use it. Specifically, we show how to identify necessary conditions for universality, compare types of universality within each instance, and establish that universality and negation give rise to unreachability (such as uncomputability).

Read the full article at: arxiv.org

Scooped by Complexity Digest
July 1, 8:38 AM

Evolving reservoir computers reveals bidirectional coupling between predictive power and emergent dynamics

Hanna M. Tolle, Andrea I. Luppi, Anil K. Seth, Pedro A. M. Mediano

Biological neural networks can perform complex computations to predict their environment, far above the limited predictive capabilities of individual neurons. While conventional approaches to understanding these computations often focus on isolating the contributions of single neurons, here we argue that a deeper understanding requires considering emergent dynamics: dynamics that make the whole system "more than the sum of its parts". Specifically, we examine the relationship between prediction performance and emergence by leveraging recent quantitative metrics of emergence, derived from Partial Information Decomposition, and by modelling the prediction of environmental dynamics in a bio-inspired computational framework known as reservoir computing. Notably, we reveal a bidirectional coupling between prediction performance and emergence, which generalises across task environments and reservoir network topologies, and is recapitulated by three key results: 1) Optimising hyperparameters for performance enhances emergent dynamics, and vice versa; 2) Emergent dynamics represent a near sufficient criterion for prediction success in all task environments, and an almost necessary criterion in most environments; 3) Training reservoir computers on larger datasets results in stronger emergent dynamics, which contain task-relevant information crucial for performance. Overall, our study points to a pivotal role of emergence in facilitating environmental predictions in a bio-inspired computational architecture.
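
For orientation, a minimal echo state network of the kind used in reservoir computing, with illustrative hyperparameters and assuming numpy: a fixed random recurrent reservoir is driven by a toy signal and a ridge-regression readout is trained for one-step-ahead prediction. The paper's emergence metrics, derived from Partial Information Decomposition, are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(5)
T, n = 2000, 100
t_axis = np.arange(T + 1)
u = np.sin(0.2 * t_axis) * np.sin(0.0331 * t_axis)   # toy quasi-periodic signal

W_in = rng.uniform(-0.5, 0.5, size=n)                # fixed input weights
W = rng.normal(size=(n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # spectral radius 0.9

x = np.zeros(n); states = np.zeros((T, n))
for t in range(T):                                   # drive the reservoir
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# ridge-regression readout: predict u[t+1] from the reservoir state at t
train, test = slice(100, 1500), slice(1500, T)
X, y = states[train], u[1:T + 1][train]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n), X.T @ y)

pred = states[test] @ W_out
rmse = np.sqrt(np.mean((pred - u[1:T + 1][test]) ** 2))
print(f"one-step-ahead test RMSE: {rmse:.4f}")
```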

Read the full article at: arxiv.org

Scooped by Complexity Digest
June 29, 1:54 PM

Computational Life: How Well-formed, Self-replicating Programs Emerge from Simple Interaction

Blaise Agüera y Arcas, Jyrki Alakuijala, James Evans, Ben Laurie, Alexander Mordvintsev, Eyvind Niklasson, Ettore Randazzo, Luca Versari

The fields of Origin of Life and Artificial Life both question what life is and how it emerges from a distinct set of "pre-life" dynamics. One common feature of most substrates where life emerges is a marked shift in dynamics when self-replication appears. While there are some hypotheses regarding how self-replicators arose in nature, we know very little about the general dynamics, computational principles, and necessary conditions for self-replicators to emerge. This is especially true on "computational substrates" where interactions involve logical, mathematical, or programming rules. In this paper we take a step towards understanding how self-replicators arise by studying several computational substrates based on various simple programming languages and machine instruction sets. We show that when random, non self-replicating programs are placed in an environment lacking any explicit fitness landscape, self-replicators tend to arise. We demonstrate how this occurs due to random interactions and self-modification, and can happen with and without background random mutations. We also show how increasingly complex dynamics continue to emerge following the rise of self-replicators. Finally, we show a counterexample of a minimalistic programming language where self-replicators are possible, but so far have not been observed to arise.

Read the full article at: arxiv.org

Scooped by Complexity Digest
June 29, 10:14 AM

Beehive scale-free emergent dynamics


Ivan Shpurov, Tom Froese & Dante R. Chialvo 

Scientific Reports volume 14, Article number: 13404 (2024)

It has been repeatedly reported that the collective dynamics of social insects exhibit universal emergent properties similar to other complex systems. In this note, we study a previously published data set in which the positions of thousands of honeybees in a hive are individually tracked over multiple days. The results show that the hive dynamics exhibit long-range spatial and temporal correlations in the occupancy density fluctuations, despite the characteristically short-range mutual interactions of bees. The variations in occupancy unveil a non-monotonic relation between density and bee flow, reminiscent of car traffic dynamics near a jamming transition, at which system performance is optimized to achieve the highest possible throughput. Overall, these results suggest that the beehive collective dynamics are self-adjusted towards a point near its optimal density.

Read the full article at: www.nature.com

Scooped by Complexity Digest
June 28, 4:07 PM

Hidden citations obscure true impact in science


Xiangyi Meng, Onur Varol, Albert-László Barabási

PNAS Nexus, Volume 3, Issue 5, May 2024, pgae155

References, the mechanism scientists rely on to signal previous knowledge, lately have turned into widely used and misused measures of scientific impact. Yet, when a discovery becomes common knowledge, citations suffer from obliteration by incorporation. This leads to the concept of hidden citation, representing a clear textual credit to a discovery without a reference to the publication embodying it. Here, we rely on unsupervised interpretable machine learning applied to the full text of each paper to systematically identify hidden citations. We find that for influential discoveries hidden citations outnumber citation counts, emerging regardless of publishing venue and discipline. We show that the prevalence of hidden citations is not driven by citation counts, but rather by the degree of the discourse on the topic within the text of the manuscripts, indicating that the more discussed is a discovery, the less visible it is to standard bibliometric analysis. Hidden citations indicate that bibliometric measures offer a limited perspective on quantifying the true impact of a discovery, raising the need to extract knowledge from the full text of the scientific corpus.

Read the full article at: academic.oup.com

Scooped by Complexity Digest
June 28, 12:09 PM

Unveiling the reproduction number scaling in characterizing social contagion coverage

Xiangrong Wang, Hongru Hou, Dan Lu, Zongze Wu, Yamir Moreno

Chaos, Solitons & Fractals

Volume 185, August 2024, 115119

The spreading of diseases depends critically on the reproduction number, which gives the expected number of new cases produced by infectious individuals during their lifetime. Here we reveal a widespread power-law scaling relationship between the variance and the mean of the reproduction number across simple and complex contagion mechanisms on various network structures. This scaling relation is verified on an empirical scientific collaboration network and analytically studied using generating functions. Specifically, we explore the impact of the scaling law of the reproduction number on the expected size of cascades of contagions. We find that the mean cascade size can be inferred from the mean reproduction number, albeit with limitations in capturing spreading variations. Nonetheless, insights derived from the tail of the distribution of the reproduction number contribute to explaining cascade size variation and allow the distinction between simple and complex contagion mechanisms. Our study sheds light on the intricate dynamics of spreading processes and cascade sizes in social networks, offering valuable insights for managing contagion outbreaks and optimizing responses to emerging threats.
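
A hedged sketch of how such a variance-mean scaling can be probed numerically, assuming numpy (graph model and parameters are illustrative, not the paper's): simulate simple contagion on random graphs at several transmission probabilities, record each case's number of secondary infections, and fit var ~ mean^alpha on log-log axes.

```python
import numpy as np

rng = np.random.default_rng(6)

def offspring_counts(N=500, p_edge=0.02, beta=0.2):
    """Run one outbreak; return the per-case reproduction numbers."""
    A = (rng.random((N, N)) < p_edge); A = np.triu(A, 1); A = A | A.T
    infected = np.zeros(N, dtype=bool); infected[0] = True
    counts = np.zeros(N)
    frontier = [0]
    while frontier:
        nxt = []
        for v in frontier:
            nbrs = np.where(A[v] & ~infected)[0]
            hit = nbrs[rng.random(len(nbrs)) < beta]   # secondary infections
            counts[v] = len(hit)
            infected[hit] = True
            nxt.extend(hit.tolist())
        frontier = nxt
    return counts[infected]

means, variances = [], []
for beta in (0.05, 0.1, 0.2, 0.4):
    c = np.concatenate([offspring_counts(beta=beta) for _ in range(20)])
    means.append(c.mean()); variances.append(c.var())

alpha = np.polyfit(np.log(means), np.log(variances), 1)[0]
print("mean R:", np.round(means, 3))
print("var  R:", np.round(variances, 3))
print(f"fitted scaling exponent: var ~ mean^{alpha:.2f}")
```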

Read the full article at: www.sciencedirect.com

Scooped by Complexity Digest
June 16, 10:51 AM

Experimental Measurement of Assembly Indices are Required to Determine The Threshold for Life

Sara I. Walker, Cole Mathis, Stuart Marshall, Leroy Cronin

Assembly Theory (AT) was developed to help distinguish living from non-living systems. The theory is simple: it posits that the amount of selection, or Assembly, is a function of the number of complex objects, where their complexity can be objectively determined using assembly indices. The assembly index of a given object relates to the number of recursive joining operations required to build that object and can not only be rigorously defined mathematically but can also be experimentally measured. In previous work we outlined the theoretical basis and presented extensive experimental measurements that demonstrated the predictive power of AT. These measurements showed that there is a threshold in assembly indices for organic molecules whereby abiotic chemical systems could not randomly produce molecules with an assembly index greater than or equal to 15. In a recent paper, Hazen et al. [1] not only confused the concept of AT with the algorithms used to calculate assembly indices, but also attempted to falsify AT by calculating theoretical assembly indices for objects made from inorganic building blocks. A fundamental misunderstanding made by the authors is that the threshold is a requirement of the theory, rather than an experimental observation. This means that exploration of inorganic assembly indices similarly requires an experimental observation, correlated with the theoretical calculations. Then and only then can the exploration of complex inorganic molecules be done using AT and the threshold for living systems, as expressed with such building blocks, be determined. Since Hazen et al. [1] present no experimental measurements of assembly theory, their analysis is not falsifiable.

Read the full article at: arxiv.org

Scooped by Complexity Digest
June 14, 6:32 AM

Measuring Complexity using Information

Klaus Jaffe

Measuring complexity in multidimensional systems with high degrees of freedom and a variety of types of information remains an important challenge. The complexity of a system is related to the number and variety of its components, the number and type of interactions among them, the degree of redundancy, and the degrees of freedom of the system. Examples show that different disciplines of science converge on complexity measures for low- and high-dimensional problems. For low-dimensional systems, such as coded strings of symbols (text, computer code, DNA, RNA, proteins, music), Shannon's Information Entropy (the expected amount of information in an event drawn from a given distribution) and Kolmogorov's Algorithmic Complexity (the length of the shortest algorithm that produces the object as output) are used for quantitative measurements of complexity. For systems with more dimensions (ecosystems, brains, social groupings), network science provides better tools for that purpose. For complex, highly multidimensional systems, none of the former methods are useful. Useful Information Φ, as proposed by Infodynamics, can be related to complexity. It can be quantified by measuring the thermodynamic Free Energy F and/or the useful Work it produces. Complexity measured as Total Information I can then be defined as the information of the system, which includes Φ, useless information or Noise N, and Redundant Information R. Measuring one or more of these variables allows quantifying and classifying complexity.
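
The two low-dimensional measures are easy to demonstrate. A short sketch using only the standard library: the Shannon entropy of a symbol string, and compressed length as the usual computable stand-in for Kolmogorov complexity (which is itself uncomputable).

```python
import math
import zlib
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Expected information per symbol, in bits."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

def compression_complexity(s: str) -> int:
    """Compressed length in bytes: an upper-bound proxy for the
    (uncomputable) Kolmogorov complexity."""
    return len(zlib.compress(s.encode()))

for s in ["ab" * 50, "abcd" * 25, "the quick brown fox jumps over the lazy dog"]:
    print(f"H = {shannon_entropy(s):5.3f} bits/symbol, "
          f"zlib = {compression_complexity(s):3d} bytes   <- {s[:16]!r}...")
```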

Read the full article at: www.qeios.com

Scooped by Complexity Digest
June 12, 12:18 PM

Irruption Theory in Phase Transitions: A Proof of Concept With the Haken-Kelso-Bunz Model

Javier Sánchez-Cañizares

Adaptive Behavior

Many theoretical studies defend the existence of ongoing phase transitions in brain dynamics that could explain its enormous plasticity in coping with the environment. However, tackling the ever-changing landscapes of brain dynamics seems a hopeless task with complex models. This paper uses a simple Haken-Kelso-Bunz (HKB) model to illustrate how phase transitions that change the number of attractors in the landscape for the relative phase between two neural assemblies can occur, helping to explain a qualitative agreement with empirical decision-making measures. Additionally, the paper discusses the possibility of interpreting this agreement with the aid of Irruption Theory (IT). Because it is the effect of symmetry breaking and the emergence of non-linearities in the fundamental equations, the order parameter governing phase transitions may not have a complete microscopic determination. Hence, many requirements of IT, particularly the Participation Criterion, could be fulfilled by the HKB model and its extensions. Briefly stated, triggering phase transitions in brain activity could thus be conceived of as a consequence of actual motivations or free will participating in decision-making processes.
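
The HKB model is compact enough to show directly. A sketch assuming numpy and the standard form of the relative-phase equation, dphi/dt = -a*sin(phi) - 2b*sin(2*phi): counting stable fixed points as the control ratio b/a decreases exhibits the phase transition from bistable coordination (in-phase and anti-phase attractors) to a single in-phase attractor, with the anti-phase attractor disappearing below b/a = 1/4.

```python
import numpy as np

def stable_fixed_points(a, b, grid=200000):
    """Stable zeros of the HKB flow f(phi) = -a*sin(phi) - 2b*sin(2*phi)
    on the circle, found as + to - sign changes of f."""
    h = 2 * np.pi / grid
    phi = -np.pi + (np.arange(grid) + 0.5) * h     # offset grid avoids exact zeros
    f = -a * np.sin(phi) - 2 * b * np.sin(2 * phi)
    s = np.sign(f)
    s_wrapped = np.append(s, s[0])                 # relative phase is periodic
    idx = np.where(np.diff(s_wrapped) < 0)[0]      # + to - crossings are stable
    return ((phi[idx] + h / 2 + np.pi) % (2 * np.pi)) - np.pi

a = 1.0
for ratio in (1.0, 0.5, 0.2, 0.1):                 # b/a as the control parameter
    fps = stable_fixed_points(a, b=ratio * a)
    print(f"b/a = {ratio:4.2f}: stable relative phases (rad) = {np.round(fps, 2)}")
```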

Read the full article at: journals.sagepub.com

Scooped by Complexity Digest
June 2, 11:27 AM

Is the Emergence of Life an Expected Phase Transition in the Evolving Universe?

Stuart Kauffman and Andrea Roli

We propose a novel definition of life in terms of which its emergence in the universe is expected, and its ever-creative open-ended evolution is entailed by no law. Living organisms are Kantian Wholes that achieve Catalytic Closure, Constraint Closure, and Spatial Closure. We here unite for the first time two established mathematical theories, namely Collectively Autocatalytic Sets and the Theory of the Adjacent Possible. The former establishes that a first-order phase transition to molecular reproduction is expected in the chemical evolution of the universe where the diversity and complexity of molecules increases; the latter posits that, under loose hypotheses, if the system starts with a small number of beginning molecules, each of which can combine with copies of itself or other molecules to make new molecules, over time the number of kinds of molecules increases slowly but then explodes upward hyperbolically. Together these theories imply that life is expected as a phase transition in the evolving universe. The familiar distinction between software and hardware loses its meaning in living cells. We propose new ways to study the phylogeny of metabolisms, new astronomical ways to search for life on exoplanets, new experiments to seek the emergence of the most rudimentary life, and the hint of a coherent testable pathway to prokaryotes with template replication and coding.
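
The hyperbolic explosion described here comes from the TAP (Theory of the Adjacent Possible) growth law. A sketch of one common truncated form, with illustrative parameters: M(t+1) = M(t) + sum over i >= 2 of alpha^i * C(M(t), i), where combinations of i existing kinds of molecules create new kinds; interactions are truncated at i_max items, as is often done in practice, to keep the sum manageable.

```python
from math import comb

def tap(m0=10, alpha=0.1, i_max=4, t_max=60, cap=1e9):
    """Iterate M(t+1) = M(t) + sum_{i=2}^{i_max} alpha**i * C(M(t), i)
    until M explodes past `cap` (hyperbolic blow-up) or t_max is hit."""
    m, history = float(m0), [float(m0)]
    for _ in range(t_max):
        gain = sum(alpha ** i * comb(int(m), i)
                   for i in range(2, min(int(m), i_max) + 1))
        m += gain
        history.append(m)
        if m > cap:
            break
    return history

hist = tap()
for t in range(0, len(hist), 4):        # long, slow rise ...
    print(f"t = {t:2d}   M = {hist[t]:,.1f}")
print(f"t = {len(hist) - 1:2d}   M = {hist[-1]:,.1f}  (explosive growth)")
```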

Read the full article at: osf.io
