Systems Biology is a young and rapidly evolving research field, which combines experimental techniques and mathematical modeling in order to achieve a mechanistic understanding of processes underlying the regulation and evolution of living systems. Systems Biology is often associated with an Engineering approach: the purpose is to formulate a data-rich, detailed simulation model that allows one to perform numerical (‘in silico’) experiments and then draw conclusions about the biological system. While methods from Engineering may be an appropriate approach to extending the scope of biological investigations to experimentally inaccessible realms and to supporting data-rich experimental work, they may not be the best strategy in a search for design principles of biological systems and the fundamental laws underlying Biology. Physics has a long tradition of characterizing and understanding emergent collective behaviors in systems of interacting units and searching for universal laws. Therefore, it is natural that many concepts used in Systems Biology have their roots in Physics. With an emphasis on Theoretical Physics, we will here review the ‘Physics core’ of Systems Biology, show how some success stories in Systems Biology can be traced back to concepts developed in Physics, and discuss how Systems Biology can further benefit from its Theoretical Physics foundation.
An autonomous agent is something that can both reproduce itself and do at least one thermodynamic work cycle. It turns out that this is true of all free-living cells, excepting weird special cases. They all do work cycles, just like the bacterium spinning its flagellum as it swims up the glucose gradient. The cells in your body are busy doing work cycles all the time.
T. Bossomaier, L. Barnett, M. Harré, J.T. Lizier, "An Introduction to Transfer Entropy: Information Flow in Complex Systems", Springer, 2016.
This book considers a relatively new measure in complex systems, transfer entropy, derived from a series of measurements, usually a time series. After a qualitative introduction and a chapter that explains the key ideas from statistics required to understand the text, the authors then present information theory and transfer entropy in depth. A key feature of the approach is the authors' work to show the relationship between information flow and complexity. The later chapters demonstrate information transfer in canonical systems, and applications, for example in neuroscience and in finance. The book will be of value to advanced undergraduate and graduate students and researchers in the areas of computer science, neuroscience, physics, and engineering. SpringerLink access to PDFs: http://bit.ly/tebook2016 Springer hard copy listing: http://bit.ly/tebook2016hardcopy Amazon listing: http://amzn.to/2f5YdYW
Via Complexity Digest
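To make the central quantity concrete: transfer entropy from X to Y measures how much knowing the past of X reduces uncertainty about the next value of Y, beyond what Y's own past already tells us. Below is a minimal plug-in estimator for discrete time series with history length 1 — an illustrative sketch of the measure the book studies, not code from the book itself.

```python
# Plug-in estimator of transfer entropy TE_{X->Y} (in bits) for
# discrete time series, with history length 1:
#   TE = sum over (y1, y0, x0) of p(y1, y0, x0) * log2( p(y1 | y0, x0) / p(y1 | y0) )
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Estimate TE from series x to series y with history length 1."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles_y = Counter(y[:-1])                     # y_t
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]        # p(y1 | y0, x0)
        p_cond_hist = pairs_yy[(y1, y0)] / singles_y[y0]  # p(y1 | y0)
        te += p_joint * log2(p_cond_full / p_cond_hist)
    return te

# Usage: y copies x with one step of lag, so information flows x -> y
# (roughly 1 bit per step) but not the other way around.
random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                 # y[t] = x[t-1]
print(transfer_entropy(x, y))    # close to 1 bit
print(transfer_entropy(y, x))    # close to 0
```

This naive counting estimator is biased for short series and small alphabets; the book covers the estimation issues (and continuous-valued estimators) in depth.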
The emergence in the United States of large-scale “megaregions” centered on major metropolitan areas is a phenomenon often taken for granted in both scholarly studies and popular accounts of contemporary economic geography. This paper uses a data set of more than 4,000,000 commuter flows as the basis for an empirical approach to the identification of such megaregions. We compare a method which uses a visual heuristic for understanding areal aggregation to a method which uses a computational partitioning algorithm, and we reflect upon the strengths and limitations of both. We discuss how choices about input parameters and scale of analysis can lead to different results, and stress the importance of comparing computational results with “common sense” interpretations of geographic coherence. The results provide a new perspective on the functional economic geography of the United States from a megaregion perspective, and shed light on the old geographic problem of the division of space into areal units. Dash Nelson G, Rae A (2016) An Economic Geography of the United States: From Commutes to Megaregions. PLoS ONE 11(11): e0166083. doi:10.1371/journal.pone.0166083
Via Complexity Digest
Sharing rides could drastically improve the efficiency of car and taxi transportation. Unleashing such potential, however, requires understanding how urban parameters affect the fraction of individual trips that can be shared, a quantity that we call shareability. Using data on millions of taxi trips in New York City, San Francisco, Singapore, and Vienna, we compute the shareability curves for each city, and find that a natural rescaling collapses them onto a single, universal curve. We explain this scaling law theoretically with a simple model that predicts the potential for ride sharing in any city, using a few basic urban quantities and no adjustable parameters. Accurate extrapolations of this type will help planners, transportation companies, and society at large to shape a sustainable path for urban growth. Scaling Law of Urban Ride Sharing Remi Tachet, Oleguer Sagarra, Paolo Santi, Giovanni Resta, Michael Szell, Steven Strogatz, Carlo Ratti
Via Complexity Digest
Complexity science concepts of emergence, self-organization, and feedback suggest that descriptions of systems and events are subjective, incomplete, and impermanent, similar to what we observe in quantum phenomena. Complexity science evinces an increasingly compelling alternative to reductionism for describing physical phenomena, now that shared aspects of complexity science and quantum phenomena are being scientifically substantiated. Establishment of a clear connection between chaotic complexity and quantum entanglement in small quantum systems indicates the presence of common processes involved in thermalization in large- and small-scale systems. Recent findings in the fields of quantum physics, quantum biology, and quantum cognition demonstrate evidence of the complexity science characteristics of sensitivity to initial conditions and emergence of self-organizing systems. Efficiencies in quantum superposition suggest a new paradigm in which our very notion of complexity depends on which information theory we choose to employ. Evidence of Shared Aspects of Complexity Science and Quantum Phenomena Cynthia Larson Cosmos and History: The Journal of Natural and Social Philosophy, Vol 12, No 2 (2016)
Via Complexity Digest
To analyze the reliability of a complex system described by minimal paths, an empirical likelihood method is proposed to solve the reliability test problem when the subsystem distributions are unknown. Furthermore, we provide a reliability test statistic of the complex system and derive the limit distribution of the test statistic. Therefore, we can obtain the confidence interval for reliability and make statistical inferences. Simulation studies also corroborate the theoretical results.
A reflection of our ultimate understanding of a complex system is our ability to control its behavior. Typically, control has multiple prerequisites: it requires an accurate map of the network that governs the interactions between the system’s components, a quantitative description of the dynamical laws that govern the temporal behavior of each component, and an ability to influence the state and temporal behavior of a selected subset of the components. With deep roots in dynamical systems and control theory, notions of control and controllability have taken on a new life recently in the study of complex networks, inspiring several fundamental questions: What are the control principles of complex systems? How do networks organize themselves to balance control with functionality? To address these questions, recent advances on the controllability and the control of complex networks are reviewed here, exploring the intricate interplay between the network topology and dynamical laws. The pertinent mathematical results are matched with empirical findings and applications. Uncovering the control principles of complex systems can help us explore and ultimately understand the fundamental laws that govern their behavior. Control principles of complex systems Yang-Yu Liu and Albert-László Barabási Rev. Mod. Phys. 88, 035006
Via Complexity Digest
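The classical starting point for this line of work is the Kalman rank condition for linear dynamics dx/dt = Ax + Bu: the system is controllable exactly when the matrix [B, AB, ..., A^(n-1)B] has full rank. A small sketch (the three-node chain and driver placements are made-up illustrations, not from the review) shows how the choice of driven node decides controllability:

```python
# Kalman rank test for controllability of a linear networked system
#   dx/dt = A x + B u
# where A encodes the network and B selects the driven nodes.
import numpy as np

def is_controllable(A, B):
    """True iff rank [B, AB, ..., A^(n-1)B] equals n (Kalman condition)."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    C = np.hstack(blocks)  # controllability matrix
    return np.linalg.matrix_rank(C) == n

# A directed chain 1 -> 2 -> 3: driving the upstream node controls the
# whole chain, while driving the downstream end does not.
A = np.array([[0., 0., 0.],
              [1., 0., 0.],
              [0., 1., 0.]])
print(is_controllable(A, np.array([[1.], [0.], [0.]])))  # True
print(is_controllable(A, np.array([[0.], [0.], [1.]])))  # False
```

The network results surveyed in the review generalize exactly this question — which and how many driver nodes suffice — to large networks where checking the rank directly is infeasible.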
We explore the behaviour of an ensemble of chaotic oscillators coupled only to an external chaotic system, whose intrinsic dynamics may be similar or dissimilar to the group. Counterintuitively, we find that a dissimilar external system manages to suppress the intrinsic chaos of the oscillators to fixed point dynamics, at sufficiently high coupling strengths. So, while synchronization is induced readily by coupling to an identical external system, control to fixed states is achieved only if the external system is dissimilar. We quantify the efficacy of control by estimating the fraction of random initial states that go to fixed points, a measure analogous to basin stability. Lastly, we indicate the generality of this phenomenon by demonstrating suppression of chaotic oscillations by coupling to a common hyperchaotic system. These results then indicate the easy controllability of chaotic oscillators by an external chaotic system, thereby suggesting a potent method that may help design control strategies.
So will we ever be able to model something as complex as the human brain using computers? After all, biological systems use symmetry and interaction to do things that even the most powerful computers cannot do – like surviving, adapting and reproducing. This is one reason why binary logic often falls short of describing how living things or human intelligence work. But our new research suggests there are alternatives: by using the mathematics that describe biological networks in the computers of the future, we may be able to make them more complex and similar to living systems like the brain. How the hidden mathematics of living cells could help us decipher the brain Chrystopher Nehaniv https://theconversation.com/how-the-hidden-mathematics-of-living-cells-could-help-us-decipher-the-brain-59483
Via Complexity Digest
How to get from a 'problematic situation' to a 'systemic intervention'? While reading '15 praktijkverhalen over kennismanagement' [Dutch for '15 practical cases of knowledge management'] I came across one story (about Kennisland, Dutch for 'knowledgeland') which triggered my curiosity. It led me to MaRS (originally 'Medical and Related Sciences', but now an acronym no more),…
Springing from my recent post distinguishing types of interdisciplinary research, I now will go into more detail on a related topic: the difference between studying particular systems that happen to be complex, and studying complexity itself. The main point is that complexity theory includes several commitments related to levels of organization and to there being shared principles/mechanisms underpinning the dynamics of disparate systems. Studying complexity is the overt researching of these commitments and underpinnings. However, most scientists who describe themselves as doing complexity research are not doing that. Instead they are studying particular complex systems and typically ignoring the commitments and underpinnings that define complexity science.

Weaver differentiates between “disorganized complexity” and “organized complexity”.
A complete concise understanding of the systems approach When I started this blog (CSL4D, i.e. Concept & Systems Learning for Design) almost 5 years ago (January 8, 2012), I had just discovered concept mapping as a great learning tool. At the same time I had a great interest in systems thinking, but found it hard…
Nearly all nontrivial real-world systems are nonlinear dynamical systems. Chaos describes certain nonlinear dynamical systems that have a very sensitive dependence on initial conditions. Chaotic systems are always deterministic and may be very simple, yet they produce completely unpredictable and divergent behavior. Systems of nonlinear equations are difficult to solve analytically, and scientists have relied heavily on visual and qualitative approaches to discover and analyze the dynamics of nonlinearity. Indeed, few fields have drawn as heavily from visualization methods for their seminal innovations: from strange attractors, to bifurcation diagrams, to cobweb plots, to phase diagrams and embedding. Although the social sciences are increasingly studying these types of systems, seminal concepts remain murky or loosely adopted. This article has three aims. First, it argues for several visualization methods to critically analyze and understand the behavior of nonlinear dynamical systems. Second, it uses these visualizations to introduce the foundations of nonlinear dynamics, chaos, fractals, self-similarity and the limits of prediction. Finally, it presents Pynamical, an open-source Python package to easily visualize and explore nonlinear dynamical systems’ behavior. Visual Analysis of Nonlinear Dynamical Systems: Chaos, Fractals, Self-Similarity and the Limits of Prediction Geoff Boeing Systems 2016, 4(4), 37; doi:10.3390/systems4040037
Via Complexity Digest
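The "sensitive dependence on initial conditions" at the heart of the article can be seen in a few lines with the logistic map x → r·x·(1−x), the canonical textbook example (the specific seeds and step counts below are illustrative choices, not taken from the paper):

```python
# Sensitive dependence on initial conditions in the logistic map
# x -> r * x * (1 - x), fully chaotic at r = 4.
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points differing by one part in a billion:
a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)

print(abs(a[10] - b[10]))   # still small after 10 steps
print(abs(a[50] - b[50]))   # typically order one once the trajectories decorrelate
```

The separation grows roughly exponentially (the Lyapunov exponent at r = 4 is ln 2), which is exactly the limit-of-prediction phenomenon the article's bifurcation and phase diagrams visualize at scale.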
Jay Forrester, one of the great minds of the 20th century, died a few days ago at the age of 98. His career was long and fruitful, and we can say that his work changed the intellectual story of humankind in various ways, in particular through the role he had in the birth of the Club of Rome's report "The Limits to Growth".
Complex systems are characterized by specific time-dependent interactions among their many constituents. As a consequence they often manifest rich, nontrivial and unexpected behavior. Examples arise both in the physical and non-physical world. The study of complex systems forms a new interdisciplinary research area that cuts across physics, biology, ecology, economics, sociology, and the humanities. In this paper we review the essence of complex systems from a physicist's point of view, and try to clarify what makes them conceptually different from systems that are traditionally studied in physics. Our goal is to demonstrate how the dynamics of such systems may be conceptualized in quantitative and predictive terms by extending notions from statistical physics and how they can often be captured in a framework of co-evolving multiplex network structures. We mention three areas of complex-systems science that are currently studied extensively: the science of cities, dynamics of societies, and the representation of texts as evolutionary objects. We discuss why these areas form complex systems in the above sense. We argue that there exists plenty of new land for physicists to explore and that methodical and conceptual progress is needed most. Complex systems: physics beyond physics Yurij Holovatch, Ralph Kenna, Stefan Thurner
Via Complexity Digest
How did human societies evolve from small groups, integrated by face-to-face cooperation, to huge anonymous societies of today, typically organized as states? Why is there so much variation in the ability of different human populations to construct viable states? Existing theories are usually formulated as verbal models and, as a result, do not yield sharply defined, quantitative predictions that could be unambiguously tested with data. Here we develop a cultural evolutionary model that predicts where and when the largest-scale complex societies arose in human history. The central premise of the model, which we test, is that costly institutions that enabled large human groups to function without splitting up evolved as a result of intense competition between societies—primarily warfare. Warfare intensity, in turn, depended on the spread of historically attested military technologies (e.g., chariots and cavalry) and on geographic factors (e.g., rugged landscape). The model was simulated within a realistic landscape of the Afroeurasian landmass and its predictions were tested against a large dataset documenting the spatiotemporal distribution of historical large-scale societies in Afroeurasia between 1,500 BCE and 1,500 CE. The model-predicted pattern of spread of large-scale societies was very similar to the observed one. Overall, the model explained 65% of variance in the data. An alternative model, omitting the effect of diffusing military technologies, explained only 16% of variance. Our results support theories that emphasize the role of institutions in state-building and suggest a possible explanation why a long history of statehood is positively correlated with political stability, institutional quality, and income per capita.
Yesterday Blake Pollard and I drove to Metron's branch in San Diego. For the first time, I met four of the main project participants: John Foley (math), Thy Tran (programming), Tom Mifflin and Chris Boner (two higher-ups involved in the project). Jeff Monroe and Tiffany Change gave us a briefing on Metron's ExAMS software. This…
In 10 slides I will explain the above concept map, which enables an integrated conceptualization of the logical relationships of the core characteristics of wicked problems with the basic requirements and workings of the systems approach.
About the Course: This course will explore how to use agent-based modeling to understand and examine a widely diverse and disparate set of complex problems. During the course, we will explore why agent-based modeling is a powerful new way to understand complex systems, what kinds of systems are amenable to complex systems analysis, and how agent-based modeling has been used in the past to study everything from economics to biology to political science to business and management. We will also teach you how to build a model from the ground up and how to analyze and understand the results of a model using the NetLogo programming language. We will also discuss how to build models that are sound and rigorous. No programming background or knowledge is required, and the methods examined will be usable in any number of different fields…
Via Jürgen Kanz
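To give a feel for what "building a model from the ground up" means, here is a tiny agent-based model in the spirit of the classic "simple economy" exchange model: every tick, each agent with money gives one unit to a randomly chosen agent, and strong inequality emerges from perfectly symmetric rules. (The course uses NetLogo; this Python sketch, with made-up parameters, is only an illustration of the style of model.)

```python
# Minimal agent-based "simple economy" sketch: identical agents,
# symmetric random exchange, emergent unequal wealth distribution.
import random

def simple_economy(n_agents=100, start=10, ticks=1000, seed=42):
    """Run the exchange model and return the final wealth of each agent."""
    rng = random.Random(seed)
    wealth = [start] * n_agents
    for _ in range(ticks):
        for i in range(n_agents):
            if wealth[i] > 0:                 # only solvent agents give
                j = rng.randrange(n_agents)   # random recipient
                wealth[i] -= 1
                wealth[j] += 1
    return wealth

wealth = simple_economy()
print(sum(wealth))                # total wealth is conserved: 1000
print(min(wealth), max(wealth))  # yet the distribution becomes very unequal
```

The point, typical of agent-based modeling, is that the macro-level pattern (a skewed, roughly exponential wealth distribution) is nowhere in the micro-level rules — it emerges from their repeated interaction.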
The hypothesis that living systems can benefit from operating in the vicinity of critical points has gained momentum in recent years. Criticality may confer an optimal balance between too ordered and exceedingly noisy states. Here we present a model, based on information theory and statistical mechanics, illustrating how and why a community of agents aimed at understanding and communicating with each other converges to a globally coherent state in which all individuals are close to an internal critical state, i.e. at the borderline between order and disorder. We study—both analytically and computationally—the circumstances under which criticality is the best possible outcome of the dynamical process, confirming the convergence to critical points under very generic conditions. Finally, we analyze the effect of cooperation (agents trying to enhance not only their fitness, but also that of other individuals) and competition (agents trying to improve their own fitness and to diminish that of competitors) within our setting. The conclusion is that, while competition fosters criticality, cooperation hinders it and can lead to more ordered or more disordered consensual outcomes.
Via Samir, Complexity Digest
While I've focused this week thus far on Cities and the Wealth of Nations, Jane Jacobs' most popular book among planners is, of course, The Death and Life of Great American Cities. This is because the latter book contains all of the happy things…
