Papers
 Scooped by Complexity Digest onto Papers

Choosing democracy over Big Brother

Strongly variable, highly complex systems – such as societies – cannot be properly managed by planning, optimisation and top-down control.

Instead, societal decision making and economic production processes should be run in a much more complex, participatory way, much like the decentralised self-organisation principles that drive the economy and organisation of the internet.

One day, advanced collaboration platforms will allow anyone to set up projects with others to create their own products, for example with 3D printers, so it might be that classical companies and political parties as institutions will increasingly be replaced by project-based initiatives.



Recent publications related to complex systems
 Scooped by Complexity Digest

Model versions and fast algorithms for network epidemiology

Network epidemiology has become a core framework for investigating the role of human contact patterns in the spreading of infectious diseases. Network epidemiology represents the contact structure as a network of nodes (individuals) connected by links (sometimes as a temporal network, where the links are not continuously active) and the disease as a compartmental model (where individuals are assigned states with respect to the disease and follow certain transition rules between the states). In this paper, we discuss fast algorithms for such simulations and also compare two commonly used versions: one where there is a constant recovery rate (the number of individuals that stop being infectious per unit time is proportional to the number of such people), the other where the duration of the disease is constant. We find that, for most practical purposes, these versions are qualitatively the same.

Model versions and fast algorithms for network epidemiology
Petter Holme

http://arxiv.org/abs/1403.1011
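The two recovery conventions the abstract compares can be sketched in a few lines of discrete-time simulation. The toy circulant graph, the rates, and the synchronous updating below are illustrative assumptions only, not the authors' (much faster, event-driven) algorithms:

```python
import random

def sir(adj, beta, recovery, steps=200, seed=1, source=0):
    """Discrete-time SIR on a network (adjacency dict).

    recovery = ("rate", mu): each infectious node recovers with
               probability mu per step (geometric/exponential durations);
    recovery = ("fixed", d): every infection lasts exactly d steps.
    Returns the final fraction of ever-infected nodes.
    """
    rng = random.Random(seed)
    state = {v: "S" for v in adj}
    infected_at = {source: 0}
    state[source] = "I"
    for t in range(1, steps + 1):
        infectious = [v for v in adj if state[v] == "I"]
        if not infectious:
            break
        for v in infectious:                      # transmission
            for w in adj[v]:
                if state[w] == "S" and rng.random() < beta:
                    state[w] = "I"
                    infected_at[w] = t
        kind, par = recovery                      # recovery
        for v in infectious:
            if kind == "rate":
                if rng.random() < par:
                    state[v] = "R"
            elif t - infected_at[v] >= par:
                state[v] = "R"
    return sum(1 for v in adj if state[v] != "S") / len(adj)

# Toy circulant graph; the mean duration 1/0.25 = 4 matches the fixed case.
adj = {i: {(i - 1) % 20, (i + 1) % 20, (i + 5) % 20, (i - 5) % 20}
       for i in range(20)}
frac_rate = sir(adj, beta=0.3, recovery=("rate", 0.25))
frac_fixed = sir(adj, beta=0.3, recovery=("fixed", 4))
```

Comparing `frac_rate` and `frac_fixed` across many seeds is the kind of version comparison the paper makes rigorous.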

 Scooped by Complexity Digest

Predicting Scientific Success Based on Coauthorship Networks

We address the question to what extent the success of scientific articles is due to social influence. Analyzing a data set of over 100,000 publications from the field of Computer Science, we study how centrality in the coauthorship network differs between authors who have highly cited papers and those who do not. We further show that a machine learning classifier, based only on coauthorship network centrality measures at the time of publication, is able to predict with high precision whether an article will be highly cited five years after publication. By this we provide quantitative insight into the social dimension of scientific publishing - challenging the perception of citations as an objective, socially unbiased measure of scientific success.

Predicting Scientific Success Based on Coauthorship Networks
Emre Sarigöl, Rene Pfitzner, Ingo Scholtes, Antonios Garas, Frank Schweitzer

http://arxiv.org/abs/1402.7268
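As a rough illustration of the kind of feature the classifier consumes, one can compute a centrality score on a toy coauthorship network. The data, the degree-only centrality, and the median-threshold "classifier" below are stand-ins for the paper's much richer measure set and trained model:

```python
from itertools import combinations

# Hypothetical toy corpus: each paper is its set of author names.
papers = [{"A", "B"}, {"A", "C"}, {"A", "B", "D"}, {"E", "F"}, {"B", "C"}]

# Coauthorship network: a link means at least one joint paper.
adj = {}
for authors in papers:
    for u, v in combinations(sorted(authors), 2):
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

# Degree centrality, one of several centrality measures used as features
# (the paper also uses betweenness, eigenvector centrality, etc.).
n = len(adj)
centrality = {v: len(adj[v]) / (n - 1) for v in adj}

# Stand-in "classifier": flag the top half by centrality as likely
# to be highly cited (the paper trains a proper ML classifier).
ranked = sorted(centrality, key=centrality.get, reverse=True)
predicted_high_impact = set(ranked[: n // 2])
```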

 Scooped by Complexity Digest

Correlation of automorphism group size and topological properties with program-size complexity evaluations of graphs and complex networks

We show that numerical approximations of Kolmogorov complexity (K) of graphs and networks capture some group-theoretic and topological properties of empirical networks, ranging from metabolic to social networks, and of small synthetic networks that we have produced. That K and the size of the group of automorphisms of a graph are correlated opens up interesting connections to problems in computational geometry, and thus connects several measures and concepts from complexity science. We derive these results via two different Kolmogorov complexity approximation methods applied to the adjacency matrices of the graphs and networks. The methods used are the traditional lossless compression approach to Kolmogorov complexity, and a normalized version of a Block Decomposition Method (BDM) based on algorithmic probability theory.

Correlation of automorphism group size and topological properties with program-size complexity evaluations of graphs and complex networks
H. Zenil et al.
Physica A: Statistical Mechanics and its Applications, 2014
http://www.sciencedirect.com/science/article/pii/S0378437114001691

Preprint available: http://arxiv.org/abs/1306.0322
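The lossless-compression route to approximating K is easy to illustrate with an off-the-shelf compressor (zlib here, as a stand-in for the compressors the authors actually evaluate). A highly symmetric ring, whose automorphism group is large, should compress to a shorter description than an irregular random graph; graph sizes and the random construction are illustrative choices:

```python
import random, zlib

def adjacency_string(adj, n):
    """Flatten the n-by-n adjacency matrix into a 0/1 string."""
    return "".join("1" if j in adj[i] else "0"
                   for i in range(n) for j in range(n))

def k_compress(adj, n):
    """Crude upper bound on K: size in bytes of the losslessly
    compressed adjacency matrix."""
    return len(zlib.compress(adjacency_string(adj, n).encode()))

n = 32
# A ring: highly symmetric, large automorphism group.
ring = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
# An irregular random graph: typically few or no nontrivial automorphisms.
rng = random.Random(0)
rnd = {i: set() for i in range(n)}
for _ in range(2 * n):
    a, b = rng.randrange(n), rng.randrange(n)
    if a != b:
        rnd[a].add(b)
        rnd[b].add(a)

k_ring, k_rnd = k_compress(ring, n), k_compress(rnd, n)
```

The expected ordering `k_ring < k_rnd` is the compression-side shadow of the K-versus-automorphism-group correlation the paper reports.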

 Scooped by Complexity Digest

Origin of Peer Influence in Social Networks

Social networks pervade our everyday lives: we interact, influence, and are influenced by our friends and acquaintances. With the advent of the World Wide Web, large amounts of data on social networks have become available, allowing the quantitative analysis of the distribution of information on them, including behavioral traits and fads. Recent studies of correlations among members of a social network, who exhibit the same trait, have shown that individuals influence not only their direct contacts but also friends’ friends, up to a network distance extending beyond their closest peers. Here, we show how such patterns of correlations between peers emerge in networked populations. We use standard models (yet reflecting intrinsically different mechanisms) of information spreading to argue that empirically observed patterns of correlation among peers emerge naturally from a wide range of dynamics, being essentially independent of the type of information, on how it spreads, and even on the class of underlying network that interconnects individuals. Finally, we show that the sparser and clustered the network, the more far reaching the influence of each individual will be.
DOI: http://dx.doi.org/10.1103/PhysRevLett.112.098702

Origin of Peer Influence in Social Networks
Phys. Rev. Lett. 112, 098702 – Published 6 March 2014
Flávio L. Pinheiro, Marta D. Santos, Francisco C. Santos, and Jorge M. Pacheco
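A minimal sketch of how peer correlations at different network distances can be measured after a spreading process. The ring-with-chords graph, the SI-style dynamics, and every parameter below are illustrative assumptions, not the models analyzed in the paper; with sparser adoption one would see the adoption probability decay with distance:

```python
import random
from collections import deque

def bfs_dist(adj, src):
    """Network distance from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return dist

# Hypothetical ring-with-chords network; any sparse clustered graph works.
n = 60
adj = {i: {(i - 1) % n, (i + 1) % n, (i + 7) % n, (i - 7) % n}
       for i in range(n)}

# SI-style spread of a binary trait from a single seed.
rng = random.Random(3)
trait = {v: 0 for v in adj}
trait[0] = 1
for _ in range(25):
    for v in [u for u in adj if trait[u]]:   # snapshot of current adopters
        for w in adj[v]:
            if not trait[w] and rng.random() < 0.1:
                trait[w] = 1

# Probability that a node at network distance d from an adopter
# is itself an adopter.
by_dist = {}
for v in adj:
    if trait[v]:
        for w, d in bfs_dist(adj, v).items():
            if w != v:
                hits, total = by_dist.get(d, (0, 0))
                by_dist[d] = (hits + trait[w], total + 1)
prob = {d: hits / total for d, (hits, total) in by_dist.items()}
```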

Eli Levine's curator insight,

Indeed, we are all interconnected in very profound and subtle ways, whether we accept it or not.

This one's for the Libertarians and conservatives out there, who don't seem to think that their actions affect the other, or that the other can affect them, or that the actions done unto the other will affect the actions that are done unto them by the other.

Kind of like how they blame the poor for being angry at the rich, after the poor produced the wealth that engorges the rich.

Silly people....

 Scooped by Complexity Digest

China is luring back expatriates with generous incentives, causing many to weigh the pros and cons of returning.

Quirin Schiermeier
Nature 507, 129–131 (06 March 2014) http://dx.doi.org/10.1038/nj7490-129a

Eli Levine's curator insight,

Terrific....

And what are we investing in academia?  A load of crap to "confirm" the results our elites WANT to have confirmed (versus the results that are actually present)?

What ARE we DOING?

 Scooped by Complexity Digest

The strength of ‘weak signals’

As information thunders through the digital economy, it’s easy to miss valuable “weak signals” often hidden amid the noise. Arising primarily from social media, they represent snippets—not streams—of information and can help companies to figure out what customers want and to spot looming industry and market disruptions before competitors do. Sometimes, companies notice them during data-analytics number-crunching exercises. Or employees who apply methods more akin to art than to science might spot them and then do some further number crunching to test anomalies they’re seeing or hypotheses the signals suggest. In any case, companies are just beginning to recognize and capture their value. Here are a few principles that companies can follow to grasp and harness the power of weak signals.

Eli Levine's curator insight,

The same can be said for governing, although the end goal is, when it's actually working for the sake of the governing, how to better serve people according to their needs and expressed desires.  The reward for good governance is continued time in office.  The way you actually get to that end is through a combination of listening for NEEDS (which aren't the same as wants) within the general public and then actively teasing those needs out so that you can understand them.

It's a pro-active dialogue, especially on the part of the governing, if it is being done in a way that is actually beneficial for the governing and the governed alike.  The former depends on the latter more than the latter depends on the former, because it is the governed who give authority to the governing, while the governed can exist (if sub-optimally) without the governing group's presence.  It doesn't even matter which specific group is in power, since they're all going to be bound to do the same basic stuff in the same basic ways, if they're going to produce optimal results for themselves and other people living in the society as a whole.  The only question that matters is "how well does the present governing group do at governing?"  Society is constantly open to shopping for other options; constantly playing the field if things become sub-optimal for society in some way, shape or form.

That is why a good government is proactive when working with its citizens and listening for these "weak signals", because those are what reveals the subtle workings of the group's psychology and what the group actually is needing/wanting versus what they explicitly express.

 Scooped by Complexity Digest

How to Save Human Lives with Complexity Science

We discuss models and data of crowd disasters, crime, terrorism, war and disease spreading to show that conventional recipes, such as deterrence strategies, are not effective and sufficient to contain them. The failure of many conventional approaches results from their neglect of feedback loops, instabilities and/or cascade effects, due to which equilibrium models often do not provide a good picture of the actual system behavior. However, the complex and often counter-intuitive behavior of social systems and their macro-level collective dynamics can be understood by means of complexity science, which enables one to address the aforementioned problems more successfully. We highlight that a suitable system design and management can help to stop undesirable cascade effects and to enable favorable kinds of self-organization in the system. In such a way, complexity science can help to save human lives.

How to Save Human Lives with Complexity Science
Dirk Helbing, Dirk Brockmann, Thomas Chadefaux, Karsten Donnay, Ulf Blanke, Olivia Woolley-Meza, Mehdi Moussaid, Anders Johansson, Jens Krause, Sebastian Schutte, Matjaz Perc

http://arxiv.org/abs/1402.7011

Eli Levine's curator insight,

This makes more intuitive sense than the linear-equilibrium stuff, in all honesty.  The more we know, the better we'll be at resolving these common problems.

Wolf Hesse's curator insight,

#activism


Liz Rykert's curator insight,

Here is the critical summary: "We highlight that a suitable system design and management can help to stop undesirable cascade effects and to enable favorable kinds of self-organization in the system. In such a way, complexity science can help to save human lives."

 Scooped by Complexity Digest

Combining Experiments and Simulations Using the Maximum Entropy Principle

A key component of computational biology is to compare the results of computer modelling with experimental measurements. Despite substantial progress in the models and algorithms used in many areas of computational biology, such comparisons sometimes reveal that the computations are not in quantitative agreement with experimental data. The principle of maximum entropy is a general procedure for constructing probability distributions in the light of new data, making it a natural tool in cases when an initial model provides results that are at odds with experiments. The number of maximum entropy applications in our field has grown steadily in recent years, in areas as diverse as sequence analysis, structural modelling, and neurobiology. In this Perspectives article, we give a broad introduction to the method, in an attempt to encourage its further adoption. The general procedure is explained in the context of a simple example, after which we proceed with a real-world application in the field of molecular simulations, where the maximum entropy procedure has recently provided new insight. Given the limited accuracy of force fields, macromolecular simulations sometimes produce results that are not in complete quantitative accordance with experiments. A common solution to this problem is to explicitly ensure agreement between the two by perturbing the potential energy function towards the experimental data. So far, a general consensus for how such perturbations should be implemented has been lacking. Three very recent papers have explored this problem using the maximum entropy approach, providing both new theoretical and practical insights to the problem. We highlight each of these contributions in turn and conclude with a discussion on remaining challenges.

Boomsma W, Ferkinghoff-Borg J, Lindorff-Larsen K (2014) Combining Experiments and Simulations Using the Maximum Entropy Principle. PLoS Comput Biol 10(2): e1003406. http://dx.doi.org/10.1371/journal.pcbi.1003406
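In its simplest form, the maximum-entropy correction described here amounts to exponentially reweighting simulation samples so that a target (experimental) average is matched while the weights stay as close to uniform as possible. This sketch, with made-up numbers, solves for the single Lagrange multiplier by bisection:

```python
import math

def maxent_reweight(samples, target, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy reweighting: find the Lagrange multiplier lam
    so that weights w_i proportional to exp(-lam * x_i) reproduce the
    experimental average `target` with a minimal perturbation of
    uniform weights."""
    def tilted_mean(lam):
        w = [math.exp(-lam * x) for x in samples]
        z = sum(w)
        return sum(wi * x for wi, x in zip(w, samples)) / z
    # tilted_mean is monotonically decreasing in lam, so bisect.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if tilted_mean(mid) > target:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in samples]
    z = sum(w)
    return lam, [wi / z for wi in w]

# Made-up observable: the simulation's plain mean is 2.0, but the
# "experiment" reports 1.5, so lam should come out positive.
samples = [0.0, 1.0, 2.0, 3.0, 4.0]
lam, weights = maxent_reweight(samples, target=1.5)
reweighted_mean = sum(w * x for w, x in zip(weights, samples))
```

The papers discussed in the Perspective apply this idea to potential-energy perturbations rather than to a single scalar observable.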

 Scooped by Complexity Digest

The Robustness and Evolvability of Transcription Factor Binding Sites

Robustness, the maintenance of a character in the presence of genetic change, can help preserve adaptive traits but also may hinder evolvability, the ability to bring forth novel adaptations. We used genotype networks to analyze the binding site repertoires of 193 transcription factors from mice and yeast, providing empirical evidence that robustness and evolvability need not be conflicting properties. Network vertices represent binding sites where two sites are connected if they differ in a single nucleotide. We show that the binding sites of larger genotype networks are not only more robust, but the sequences adjacent to such networks can also bind more transcription factors, thus demonstrating that robustness can facilitate evolvability.

The Robustness and Evolvability of Transcription Factor Binding Sites
Joshua L. Payne, Andreas Wagner

Science 21 February 2014:
Vol. 343 no. 6173 pp. 875-877
http://dx.doi.org/10.1126/science.1249046
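The genotype-network construction is easy to reproduce on a toy repertoire: binding sites are nodes, and two sites are neighbors when they differ in a single nucleotide. The five 4-mers below are invented for illustration; the paper's repertoires come from binding data for 193 transcription factors:

```python
def neighbors(seq, alphabet="ACGT"):
    """All sequences at Hamming distance one from seq."""
    out = []
    for i, c in enumerate(seq):
        for a in alphabet:
            if a != c:
                out.append(seq[:i] + a + seq[i + 1:])
    return out

# Invented repertoire of 4-mers one transcription factor binds.
bound = {"ACGT", "ACGA", "ACGC", "AAGT", "TCGT"}

# Robustness of a site: fraction of one-mutant neighbors still bound.
robustness = {s: sum(nb in bound for nb in neighbors(s)) / len(neighbors(s))
              for s in bound}

# Evolvability proxy: sequences adjacent to the genotype network but
# outside it, i.e. novel sites reachable by a single mutation.
frontier = {nb for s in bound for nb in neighbors(s)} - bound
```

The paper's point is that larger genotype networks score higher on both quantities at once, so robustness and evolvability need not trade off.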

 Scooped by Complexity Digest

Predicting Crowd Behavior with Big Public Data

With public information becoming widely accessible and shared on today's web, greater insights are possible into crowd actions by citizens and non-state actors such as large protests and cyber activism. We present efforts to predict the occurrence, specific timeframe, and location of such actions before they occur based on public data collected from over 300,000 open content web sources in 7 languages, from all over the world, ranging from mainstream news to government publications to blogs and social media. Using natural language processing, event information is extracted from content such as type of event, what entities are involved and in what role, sentiment and tone, and the occurrence time range of the event discussed. Statements made on Twitter about a future date from the time of posting prove particularly indicative. We consider in particular the case of the 2013 Egyptian coup d'etat. The study validates and quantifies the common intuition that data on social media (beyond mainstream news sources) are able to predict major events.

Predicting Crowd Behavior with Big Public Data
Nathan Kallus

http://arxiv.org/abs/1402.2308

António F Fonseca's curator insight,

It's becoming standard practice.

 Suggested by Joseph Lizier

Damage spreading in spatial and small-world random Boolean networks

The study of the response of complex dynamical social, biological, or technological networks to external perturbations has numerous applications. Random Boolean networks (RBNs) are commonly used as a simple generic model for certain dynamics of complex systems. Traditionally, RBNs are interconnected randomly and without considering any spatial extension and arrangement of the links and nodes. However, most real-world networks are spatially extended and arranged with regular, power-law, small-world, or other nonrandom connections. Here we explore the RBN network topology between extreme local connections, random small-world, and pure random networks, and study the damage spreading with small perturbations. We find that spatially local connections change the scaling of the Hamming distance at very low connectivities ($\bar{K} \ll 1$) and that the critical connectivity of stability $\bar{K}$ changes compared to random networks. At higher $\bar{K}$, this scaling remains unchanged. We also show that the Hamming distance of spatially local networks scales with a power law as the system size $N$ increases, but with a different exponent for local and small-world networks. The scaling arguments for small-world networks are obtained with respect to the system sizes and strength of spatially local connections. We further investigate the wiring cost of the networks. From an engineering perspective, our new findings provide the key design trade-offs between damage spreading (robustness), the network's wiring cost, and the network's communication characteristics.

Qiming Lu and Christof Teuscher
Damage spreading in spatial and small-world random Boolean networks
Phys. Rev. E 89, 022806 (2014)

http://pre.aps.org/abstract/PRE/v89/i2/e022806
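Damage spreading in an RBN is measured by running two copies of the same network from initial states that differ in a single bit and recording their Hamming distance. This sketch uses the classic random wiring, whereas the paper's contribution is precisely the effect of spatially local and small-world wiring; network size and parameters are illustrative:

```python
import random

def random_boolean_network(n, k, rng):
    """Classic RBN: node i has k random inputs and a random Boolean
    function stored as a truth table over the 2**k input patterns."""
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randrange(2) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update of every node from its inputs' states."""
    out = []
    for i in range(len(state)):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | state[j]
        out.append(tables[i][idx])
    return out

def damage(n=50, k=2, steps=30, seed=7):
    """Hamming distance between two copies of one RBN started from
    states differing in a single bit."""
    rng = random.Random(seed)
    inputs, tables = random_boolean_network(n, k, rng)
    a = [rng.randrange(2) for _ in range(n)]
    b = a[:]
    b[0] ^= 1                    # the small perturbation
    for _ in range(steps):
        a = step(a, inputs, tables)
        b = step(b, inputs, tables)
    return sum(x != y for x, y in zip(a, b))

d = damage()
```

Averaging `damage` over many network realizations, as a function of $\bar{K}$ and wiring, reproduces the kind of scaling curves the paper studies.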

 Scooped by Complexity Digest

The complex architecture of primes and natural numbers

Natural numbers can be divided in two non-overlapping infinite sets, primes and composites, with composites factorizing into primes. Despite their apparent simplicity, the elucidation of the architecture of natural numbers with primes as building blocks remains elusive. Here, we propose a new approach to decoding the architecture of natural numbers based on complex networks and stochastic processes theory. We introduce a parameter-free non-Markovian dynamical model that naturally generates random primes and their relation with composite numbers with remarkable accuracy. Our model satisfies the prime number theorem as an emerging property and a refined version of Cramér's conjecture about the statistics of gaps between consecutive primes that seems closer to reality than the original Cramér's version. Regarding composites, the model helps us to derive the prime factors counting function, giving the probability of distinct prime factors for any integer. Probabilistic models like ours can help not only to conjecture but also to prove results about primes and the complex architecture of natural numbers.

The complex architecture of primes and natural numbers
Guillermo Garcia-Perez, M. Angeles Serrano, Marian Boguna

http://arxiv.org/abs/1402.3612
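A useful baseline for the paper's model is Cramér's classical random model of the primes, which already recovers the prime number theorem statistically. This sketch implements that baseline only; the paper's parameter-free non-Markovian model is a refinement of it:

```python
import math
import random

def cramer_primes(n_max, seed=5):
    """Cramér's random model: each n >= 3 is declared 'prime'
    independently with probability 1 / ln n (2 added by hand)."""
    rng = random.Random(seed)
    out = [2]
    for n in range(3, n_max):
        if rng.random() < 1.0 / math.log(n):
            out.append(n)
    return out

n_max = 100_000
model = cramer_primes(n_max)
# Prime number theorem check: pi(x) ~ x / ln x, so the ratio below
# should be of order one.
ratio = len(model) / (n_max / math.log(n_max))
```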

 Scooped by Complexity Digest

Designing Collective Behavior in a Termite-Inspired Robot Construction Team

Complex systems are characterized by many independent components whose low-level actions produce collective high-level results. Predicting high-level results given low-level rules is a key open challenge; the inverse problem, finding low-level rules that give specific outcomes, is in general still less understood. We present a multi-agent construction system inspired by mound-building termites, solving such an inverse problem. A user specifies a desired structure, and the system automatically generates low-level rules for independent climbing robots that guarantee production of that structure. Robots use only local sensing and coordinate their activity via the shared environment. We demonstrate the approach via a physical realization with three autonomous climbing robots limited to onboard sensing. This work advances the aim of engineering complex systems that achieve specific human-designed goals.

Designing Collective Behavior in a Termite-Inspired Robot Construction Team
Justin Werfel, Kirstin Petersen, Radhika Nagpal

Science 14 February 2014:
Vol. 343 no. 6172 pp. 754-758
http://dx.doi.org/10.1126/science.1245842

 Scooped by Complexity Digest

Netconomics: Novel Forecasting Techniques from the Combination of Big Data, Network Science and Economics

The combination of the network theoretic approach with recently available abundant economic data leads to the development of novel analytic and computational tools for modelling and forecasting key economic indicators. The main idea is to introduce a topological component into the analysis, taking into account consistently all higher-order interactions. We present three basic methodologies to demonstrate different approaches to harness the resulting network gain. First, a multiple linear regression optimisation algorithm is used to generate a relational network between individual components of national balance of payment accounts. This model describes annual statistics with a high accuracy and delivers good forecasts for the majority of indicators. Second, an early-warning mechanism for global financial crises is presented, which combines network measures with standard economic indicators. From the analysis of the cross-border portfolio investment network of long-term debt securities, the proliferation of a wide range of over-the-counter-traded financial derivative products, such as credit default swaps, can be described in terms of gross-market values and notional outstanding amounts, which are associated with increased levels of market interdependence and systemic risk. Third, considering the flow-network of goods traded between G-20 economies, network statistics provide better proxies for key economic measures than conventional indicators. For example, it is shown that a country's gate-keeping potential, as a measure for local power, projects its annual change of GDP generally far better than the volume of its imports or exports.

Netconomics: Novel Forecasting Techniques from the Combination of Big Data, Network Science and Economics
Andreas Joseph, Irena Vodenska, Eugene Stanley, Guanrong Chen

http://arxiv.org/abs/1403.0848

 Scooped by Complexity Digest

CTL update of Kripke models through protections

We present a nondeterministic, recursive algorithm for updating a Kripke model so as to satisfy a given formula of computation-tree logic (CTL). Recursive algorithms for model update face two dual difficulties: (1) Removing transitions from a Kripke model to satisfy a universal subformula may dissatisfy some existential subformulas. Conversely, (2) adding transitions to satisfy an existential subformula may dissatisfy some universal subformulas. To overcome these difficulties, we employ protections of the form 〈E,A,L〉, recording information about the satisfaction of subformulas previously treated by the algorithm. Intuitively, (1) E is the set of transitions that we cannot remove without compromising the satisfaction of previously treated subformulas. Conversely, (2) A is the set of transitions that we can add. Hence, update proceeds without diminishing E and without augmenting A. Finally, (3) L is a set of literals protecting the model labels. We illustrate our algorithm through several examples: Emerson and Clarke's mutual-exclusion problem, Clarke's microwave-oven example, synchronous counters, and randomly generated models and formulas. In addition, we compare our method with other update approaches for either CTL or fragments of CTL. Lastly, we provide proofs of soundness and completeness and a complexity analysis.

CTL update of Kripke models through protections
Miguel Carrillo, David A. Rosenblueth

Artificial Intelligence, In Press
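The update algorithm presupposes a way to check CTL satisfaction on a Kripke model. A minimal fixpoint evaluator for EX and EF over an invented four-state model (this is not the paper's update procedure or its protections, just the satisfaction check such an update targets):

```python
# A tiny Kripke model: states, transition relation, labelling.
# The four states and labels are invented for illustration.
states = {0, 1, 2, 3}
R = {(0, 1), (1, 2), (1, 3), (2, 0), (3, 3)}
label = {0: set(), 1: {"try"}, 2: {"crit"}, 3: {"crit"}}

def sat_ex(phi):
    """EX phi: states with at least one successor satisfying phi."""
    return {s for s in states if any((s, t) in R for t in phi)}

def sat_ef(phi):
    """EF phi as the least fixpoint of Z = phi union EX Z."""
    z = set(phi)
    while True:
        nz = z | sat_ex(z)
        if nz == z:
            return z
        z = nz

crit = {s for s in states if "crit" in label[s]}
ef_crit = sat_ef(crit)   # states from which 'crit' is reachable
```

An update algorithm would add or remove pairs in `R` (subject to its protections) until a target formula becomes satisfied at a chosen initial state.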

 Scooped by Complexity Digest

A Genomic Road Map for Complex Human Disease

Despite the successes of genome-wide association studies (GWAS) in identifying genetic connections with human disease, it has become clear that interpreting these data requires a clear understanding of how these new risk genes are regulated. On pages 1118 and 1119 of this issue, Fairfax et al. (1) and Lee et al. (2), respectively, elucidate networks of genetic regulation in the context of the human innate immune system and show how this information can be directly applied to understanding the genetics of autoimmune disorders.

A Genomic Road Map for Complex Human Disease
Peter K. Gregersen

Science 7 March 2014:
Vol. 343 no. 6175 pp. 1087-1088
http://dx.doi.org/10.1126/science.1251426

 Scooped by Complexity Digest

Spatially Distributed Social Complex Networks

We propose a bare-bones stochastic model that takes into account both the geographical distribution of people within a country and their complex network of connections. The model, which is designed to give rise to a scale-free network of social connections and to visually resemble the geographical spread seen in satellite pictures of the Earth at night, gives rise to a power-law distribution for the ranking of cities by population size (but for the largest cities) and reflects the notion that highly connected individuals tend to live in highly populated areas. It also yields some interesting insights regarding Gibrat’s law for the rates of city growth (by population size), in partial support of the findings in a recent analysis of real data [Rozenfeld et al., Proc. Natl. Acad. Sci. U.S.A. 105 18702 (2008)]. The model produces a nontrivial relation between city population and city population density and a superlinear relationship between social connectivity and city population, both of which seem quite in line with real data.
DOI: http://dx.doi.org/10.1103/PhysRevX.4.011008

Spatially Distributed Social Complex Networks
Phys. Rev. X 4, 011008 – Published 28 January 2014
Gerald F. Frasco, Jie Sun, Hernán D. Rozenfeld, and Daniel ben-Avraham

 Suggested by Segismundo

Inclusive fitness maximization: An axiomatic approach

[...] Our results help integrate evolutionary theory and rational choice theory, help draw out the behavioural implications of inclusive fitness maximization, and point to a possible way in which evolution could lead organisms to implement it.

 Scooped by Complexity Digest

The Relative Ineffectiveness of Criminal Network Disruption

Researchers, policymakers and law enforcement agencies across the globe struggle to find effective strategies to control criminal networks. The effectiveness of disruption strategies is known to depend on both network topology and network resilience. However, as these criminal networks operate in secrecy, data-driven knowledge concerning the effectiveness of different criminal network disruption strategies is very limited. By combining computational modeling and social network analysis with unique criminal network intelligence data from the Dutch Police, we discovered, in contrast to common belief, that criminal networks might even become 'stronger' after targeted attacks. On the other hand, increased efficiency within criminal networks decreases their internal security, thus offering opportunities for law enforcement agencies to target these networks more deliberately. Our results emphasize the importance of criminal network interventions at an early stage, before the network gets a chance to (re-)organize to maximum resilience. In the end, disruption strategies force criminal networks to become more exposed, which causes successful network disruption to become a long-term effort.

The Relative Ineffectiveness of Criminal Network Disruption
Paul A. C. Duijn, Victor Kashirin & Peter M. A. Sloot

Scientific Reports 4, Article number: 4238 http://dx.doi.org/10.1038/srep04238

Eli Levine's curator insight,

My only critique of this is that even by successfully disrupting the social networks, you will not get rid of the foundations of crime within a society.

Greed, lust, violence: all of these things come from the brain and can be seen as mental health problems, rather than necessarily just societal problems.  I think we've got to begin sorting the convicted and post-conviction crowd, such that we can understand how their brains work and then how to help heal them, such that we eliminate criminality and crime-inspired lifestyles.  I understand there are dozens of easy ways to be opposed to this, and that there are dozens more ways that work (especially here in America, where we are so focused on our small "selves" that we forget there is a much, much larger world out there, and that it is of ourselves as well).  We are connected to everyone and everything.  That's science.  To deny that is to invite delusion and hallucinations about reality, and to invite other problems into your life and the rest of ours, through deliberate ignorance and an unwillingness to accept reality where it is simply inoffensive and not politically motivated, other than to help other people.

Therefore, let's overcome this monkey need to punish people for crimes they really didn't have much say in (thanks to the primacy of the brain) and start doing some research on these people (even though they should be confined from the rest of the population until treatments and diagnoses have been developed and concluded upon).

 Scooped by Complexity Digest

Information Evolution in Social Networks

Social networks readily transmit information, albeit with less than perfect fidelity. We present a large-scale measurement of this imperfect information copying mechanism by examining the dissemination and evolution of thousands of memes, collectively replicated hundreds of millions of times in the online social network Facebook. The information undergoes an evolutionary process that exhibits several regularities. A meme's mutation rate characterizes the population distribution of its variants, in accordance with the Yule process. Variants further apart in the diffusion cascade have greater edit distance, as would be expected in an iterative, imperfect replication process. Some text sequences can confer a replicative advantage; these sequences are abundant and transfer "laterally" between different memes. Subpopulations of the social network can preferentially transmit a specific variant of a meme if the variant matches their beliefs or culture. Understanding the mechanism driving change in diffusing information has important implications for how we interpret and harness the information that reaches us through our social networks.

Information Evolution in Social Networks

http://arxiv.org/abs/1402.6792
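The copy-with-occasional-mutation mechanism behind the Yule-process fit can be simulated directly. The event count and mutation rate below are arbitrary illustrative choices, not estimates from the Facebook data:

```python
import random

def meme_variant_sizes(n_events, mu, seed=11):
    """Yule-style copying: each event reshares a uniformly chosen
    existing copy; with probability mu the copy mutates into a brand
    new variant, otherwise it inherits the variant it was copied from."""
    rng = random.Random(seed)
    variants = [0]          # variant id carried by each existing copy
    next_id = 1
    for _ in range(n_events):
        v = rng.choice(variants)
        if rng.random() < mu:
            variants.append(next_id)
            next_id += 1
        else:
            variants.append(v)
    counts = {}
    for v in variants:
        counts[v] = counts.get(v, 0) + 1
    return sorted(counts.values(), reverse=True)

sizes = meme_variant_sizes(5000, mu=0.05)
```

The resulting variant-size distribution is heavy-tailed, with the mutation rate `mu` controlling its shape, which is the regularity the paper measures at Facebook scale.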

António F Fonseca's curator insight,

Memes are the information-science counterpart of particles in physics.

 Scooped by Complexity Digest

Crowd-sourcing: Strength in numbers

Researchers are finding that online, crowd-sourced collaboration can speed up their work — if they choose the right problem.

http://www.nature.com/news/crowd-sourcing-strength-in-numbers-1.14757

 Scooped by Complexity Digest

Controlling Chimeras

Coupled phase oscillators model a variety of dynamical phenomena in nature and technological applications. A curious feature of non-locally coupled phase oscillators is the emergence of chimera states. These states are characterized by localized phase synchrony while the remaining oscillators move incoherently. Here we apply the idea of control to chimera states; through a new dynamic control scheme that exploits drift, a chimera will attain any desired target position. Our control approach extends beyond chimera states as it may also be used to optimize more general objective functions.

Controlling Chimeras
Christian Bick, Erik A. Martens

http://arxiv.org/abs/1402.6363

 Scooped by Complexity Digest

Autonomous drones flock like birds

A Hungarian team has created the first drones that can fly as a coordinated flock. The researchers watched as the ten autonomous robots took to the air in a field outside Budapest, zipping through the open sky, flying in formation or even following a leader, all without any central control.

Autonomous drones flock like birds
Ed Yong

Nature doi:10.1038/nature.2014.14776

http://www.nature.com/news/autonomous-drones-flock-like-birds-1.14776

Keith Hamon's curator insight,

I think flocking as an educational strategy deserves more study. Can a flock of birds find their way home better than a single bird? I'll bet they can, but how do they do it? How do they coordinate their knowledge and behavior?

 Scooped by Complexity Digest

Controlling Chaos Faster

Predictive Feedback Control is an easy-to-implement method to stabilize unknown unstable periodic orbits in chaotic dynamical systems. Predictive Feedback Control is severely limited because asymptotic convergence speed decreases with stronger instabilities which in turn are typical for larger target periods, rendering it harder to effectively stabilize periodic orbits of large period. Here, we study stalled chaos control, where the application of control is stalled to make use of the chaotic, uncontrolled dynamics, and introduce an adaptation paradigm to overcome this limitation and speed up convergence. This modified control scheme is not only capable of stabilizing more periodic orbits than the original Predictive Feedback Control but also speeds up convergence for typical chaotic maps, as illustrated in both theory and application. The proposed adaptation scheme provides a way to tune parameters online, yielding a broadly applicable, fast chaos control that converges reliably, even for periodic orbits of large period.

Controlling Chaos Faster
Christian Bick, Christoph Kolodziejski, Marc Timme

http://arxiv.org/abs/1402.4763
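A common textbook form of predictive feedback control, x_{n+1} = (1 - K) f(x_n) + K x_n, stabilizes an unstable fixed point of a chaotic map without changing its location. This sketch applies that form to the logistic map; the paper's scheme, and especially its "stalled" adaptation for speeding up convergence, is more elaborate, and the gain K = 0.6 is just one value in the stabilizing window for r = 3.8:

```python
def logistic(x, r=3.8):
    return r * x * (1 - x)

def pfc_orbit(x0, K, n_steps, r=3.8):
    """Predictive feedback control of a period-1 orbit:
    x_{n+1} = (1 - K) * f(x_n) + K * x_n.
    Fixed points of f are preserved; the gain K only changes
    their stability."""
    x = x0
    for _ in range(n_steps):
        x = (1 - K) * logistic(x, r) + K * x
    return x

x_star = 1 - 1 / 3.8                  # fixed point, unstable for r = 3.8
uncontrolled = pfc_orbit(0.3, K=0.0, n_steps=200)   # plain chaotic map
controlled = pfc_orbit(0.3, K=0.6, n_steps=200)     # PFC-stabilized
err = abs(controlled - x_star)
```

At the fixed point the controlled map's slope is (1 - K) f'(x*) + K = -0.12, so convergence is fast here; for longer target periods the achievable convergence slows, which is the limitation the paper's adaptation addresses.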

 Scooped by Complexity Digest

Being First Is Best: An Adventure Capitalist’s Approach to Life and Investing, A Conversation with Dean LeBaron

A self-described "adventure capitalist," Dean LeBaron is the founder and former chairman of Batterymarch Financial Management, recognized as one of the most innovative investment management firms in the industry. As an "investment futurist," LeBaron was one of the first to see the potential of quantitative investing, implementing computer-driven technology and modeling techniques at Batterymarch to systematically analyze data, trade, and manage investment portfolios. Under LeBaron's leadership, Batterymarch pioneered indexing as an investment strategy. An early adopter of a contrarian philosophy, LeBaron followed his own advice that "in the investment field, you should be where everyone else is not," leading Batterymarch to become one of the earliest (or first) institutional investors in the emerging markets of China, India, and Latin America. His interest and work in Russia resulted from an invitation from the government of President Mikhail Gorbachev to help privatize the Soviet Union's military industrial complex. With more than five decades of experience as an investment manager, LeBaron often has been the right man in the right place at the right time, following his maxim that "if the choice is limited to being best or being first, being first is often best."

http://www.imca.org/sites/default/files/current-issues/JIC/JIC142_MastersSeriesLebaron.pdf
Complexity Digest's insight:
Dean LeBaron conceived Complexity Digest and has sponsored it since 1999. http://www.deanlebaron.com
Eli Levine's curator insight,

It is the initial phases which begin the journey of a thousand steps in the long arc.

Too bad that it's so hard to change course when it was misfired in an incorrect or half-assed direction.

How to change that?