Edgar Analytics & Complex Systems
 
Rescooped by Nuno Edgar Fernandes from Papers
onto Edgar Analytics & Complex Systems

A Multi-Level Geographical Study of Italian Political Elections from Twitter Data


In this paper we present an analysis of the behavior of Italian Twitter users during national political elections. We monitor the volumes of the tweets related to the leaders of the various political parties and we compare them to the election results. Furthermore, we study the topics that are associated with the co-occurrence of two politicians in the same tweet. We cannot conclude, from a simple statistical analysis of tweet volume and their time evolution, that it is possible to precisely predict the election outcome (or at least not in our case study, which was characterized by a “too-close-to-call” scenario). On the other hand, we found that the volume of tweets and its change in time provide a very good proxy for the final results. We present this analysis both at the national level and at smaller geographical levels, from the regions composing the country up to the macro-areas (North, Center, South).
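As a rough illustration of the kind of comparison described above (not the authors' pipeline, and with invented counts and vote shares), one can turn per-leader tweet volumes into shares and check how closely they track the vote:

```python
# Minimal sketch: compare relative tweet volumes with vote shares.
# Counts and percentages are invented for illustration; this is not the
# authors' data or pipeline.
import numpy as np

tweet_counts = {"leader_A": 120_000, "leader_B": 95_000, "leader_C": 60_000}
vote_shares  = {"leader_A": 0.29,    "leader_B": 0.25,   "leader_C": 0.22}

leaders = list(tweet_counts)
volumes = np.array([tweet_counts[l] for l in leaders], dtype=float)
tweet_share = volumes / volumes.sum()               # relative tweet volume per leader
votes = np.array([vote_shares[l] for l in leaders])
votes = votes / votes.sum()                         # renormalise over the monitored leaders

r = np.corrcoef(tweet_share, votes)[0, 1]           # crude "proxy quality" check
mae = np.abs(tweet_share - votes).mean()

for leader, ts, vs in zip(leaders, tweet_share, votes):
    print(f"{leader}: tweet share {ts:.3f}, vote share {vs:.3f}")
print(f"correlation = {r:.2f}, mean absolute difference = {mae:.3f}")
```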

 

A Multi-Level Geographical Study of Italian Political Elections from Twitter Data
Caldarelli G, Chessa A, Pammolli F, Pompa G, Puliga M, et al.

PLoS ONE 9(5): e95809 (2014)
http://dx.doi.org/10.1371/journal.pone.0095809


Via Complexity Digest
Edgar Analytics & Complex Systems
A space to Scoop about Big Data and Complexity
Rescooped by Nuno Edgar Fernandes from Papers

The future cities agenda

Suddenly, ‘cities’ have become the hottest topic on the planet. National research institutes and local governments as well as various global agencies are all scrambling to get a piece of the action as cities become the places where it is considered future economic prosperity firmly lies while also offering the prospect of rescuing a developed world mired in recession.

 

Batty M, 2013, "The future cities agenda" Environment and Planning B: Planning and Design 40(2) 191–194

http://dx.doi.org/10.1068/b4002ed


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from Papers

Computational rationality: A converging paradigm for intelligence in brains, minds, and machines

After growing up together, and mostly growing apart in the second half of the 20th century, the fields of artificial intelligence (AI), cognitive science, and neuroscience are reconverging on a shared view of the computational foundations of intelligence that promotes valuable cross-disciplinary exchanges on questions, methods, and results. We chart advances over the past several decades that address challenges of perception and action under uncertainty through the lens of computation. Advances include the development of representations and inferential procedures for large-scale probabilistic inference and machinery for enabling reflection and decisions about tradeoffs in effort, precision, and timeliness of computations. These tools are deployed toward the goal of computational rationality: identifying decisions with highest expected utility, while taking into consideration the costs of computation in complex real-world problems in which most relevant calculations can only be approximated. We highlight key concepts with examples that show the potential for interchange between computer science, cognitive science, and neuroscience.
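A toy version of the central idea, with an invented decision task and an assumed per-sample cost of computation: choose the amount of computation (here, Monte Carlo samples) that maximises expected utility net of its cost.

```python
# Toy computational rationality: pick the amount of computation (Monte Carlo
# samples) that maximises expected utility minus the cost of computing.
# The decision task and the cost per sample are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def expected_utility(n_samples: int, n_trials: int = 2000) -> float:
    """Utility of guessing the sign of an unknown mean after n_samples noisy draws.

    Utility is 1 for a correct guess, 0 otherwise; more samples help, with
    diminishing returns.
    """
    correct = 0
    for _ in range(n_trials):
        mu = rng.normal(0.0, 1.0)
        observations = rng.normal(mu, 2.0, size=n_samples)
        correct += np.sign(observations.mean()) == np.sign(mu)
    return correct / n_trials

cost_per_sample = 0.002                       # assumed cost of one unit of computation
candidates = [1, 2, 5, 10, 20, 50, 100, 200]
net_utility = {n: expected_utility(n) - cost_per_sample * n for n in candidates}

for n in candidates:
    print(f"n = {n:3d}  ->  net expected utility ~ {net_utility[n]:.3f}")
print("computationally rational sample size:", max(net_utility, key=net_utility.get))
```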

 

Computational rationality: A converging paradigm for intelligence in brains, minds, and machines
Samuel J. Gershman, Eric J. Horvitz, Joshua B. Tenenbaum

Science 17 July 2015:
Vol. 349 no. 6245 pp. 273-278
http://dx.doi.org/10.1126/science.aac6076


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from COMPUTATIONAL THINKING and CYBERLEARNING

21st Century Literacy | School of Interactive Computing


Via Paul Herring, Bonnie Bracey Sutton
Paul Herring's curator insight, July 23, 11:04 PM

“I believe that computing is a new kind of literacy that is critical for all professions in the 21st century,” says Guzdial. “If I'm right, doing computing education well is as important as doing mathematics or physics education well and needs a similar level and kind of support.”

Rescooped by Nuno Edgar Fernandes from Papers

Topological data analysis of contagion maps for examining spreading processes on networks


Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth’s surface; however, in modern contagions long-range edges—for example, due to airline transportation or communication media—allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct ‘contagion maps’ that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.
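A heavily simplified sketch of the contagion-map construction (a plain threshold contagion with cluster seeding on an invented small-world network; the paper's actual pipeline and parameters differ): each node becomes a point whose coordinates are its activation times across contagions seeded at every node.

```python
# Simplified contagion map: run a threshold contagion seeded at each node's
# neighbourhood, record first-activation times, and treat each node's vector of
# activation times as a point in a point cloud. Network, threshold and horizon
# are invented; the paper's construction is richer than this.
import numpy as np
import networkx as nx

def activation_times(G, seeds, threshold=0.3, t_max=200):
    """Watts-style threshold contagion: a node activates once the fraction of its
    active neighbours reaches `threshold`. Returns each node's first activation time."""
    active = set(seeds)
    times = {n: (0 if n in seeds else np.inf) for n in G}
    for t in range(1, t_max + 1):
        newly = [n for n in G if n not in active and
                 sum(nb in active for nb in G[n]) / max(G.degree(n), 1) >= threshold]
        if not newly:
            break
        for n in newly:
            times[n] = t
            active.add(n)
    return times

# Example geometry: a ring lattice with a few long-range ("airline") edges.
G = nx.watts_strogatz_graph(n=100, k=4, p=0.05, seed=1)
nodes = list(G)

# One contagion per seed cluster (a node plus its neighbours).
runs = {seed: activation_times(G, {seed} | set(G[seed])) for seed in nodes}

# Point cloud: row i holds node i's activation times across all contagions.
point_cloud = np.array([[runs[seed][i] for seed in nodes] for i in nodes])
never = ~np.isfinite(point_cloud)
point_cloud[never] = point_cloud[~never].max() + 1     # cap nodes that never activate

print(point_cloud.shape)   # (100, 100): one 100-dimensional point per node
```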

 

Nature Communications 6, Article number: 7723 http://dx.doi.org/10.1038/ncomms8723 


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from CxAnnouncements

PhD Symposium, Hannover and Berlin, ISSS and WINS: Humboldt University | International Society for the Systems Sciences

Announcement - PhD Course at ISSS2015 in co-operation with WINS at Humboldt University
Systems Thinking and Practice in PhD Research: Cybersystemic Possibilities for Governing the Anthropocene
30 July – 7 August 2015, Germany
A joint programme designed by ISSS and the Berlin Workshop in Institutional Analysis of Social-Ecological Systems - WINS 
• Two days of participation in a Systemic Inquiry in Hannover (Herrenhausen) on “Governing the Anthropocene: Cybersystemic Possibilities?”
• Two days of dedicated ‘workshops’ introducing different systems approaches, methods and research traditions at Humboldt University in Berlin
• Five days of participation in the 2015 ISSS Conference in Berlin, including a group generated presentation on the final day
• 5 ECTS points

 

http://isss.org/world/PhD_Course


Via Complexity Digest
Scooped by Nuno Edgar Fernandes

Computer Science, Statistical Methods Combine to Analyze Stunningly Diverse ... - Scientific Computing

A multi-year study led by researchers from the Simons Center for Data Analysis and major universities and medical schools has broken substantial new ground, establishing how genes work together within 144 different human tissues and cell types in...
Rescooped by Nuno Edgar Fernandes from Papers

Machine ethics: The robot’s dilemma


In his 1942 short story 'Runaround', science-fiction writer Isaac Asimov introduced the Three Laws of Robotics — engineering safeguards and built-in ethical principles that he would go on to use in dozens of stories and novels. They were: 1) A robot may not injure a human being or, through inaction, allow a human being to come to harm; 2) A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law; and 3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Fittingly, 'Runaround' is set in 2015. Real-life roboticists are citing Asimov's laws a lot these days: their creations are becoming autonomous enough to need that kind of guidance. In May, a panel talk on driverless cars at the Brookings Institution, a think tank in Washington DC, turned into a discussion about how autonomous vehicles would behave in a crisis. What if a vehicle's efforts to save its own passengers by, say, slamming on the brakes risked a pile-up with the vehicles behind it? Or what if an autonomous car swerved to avoid a child, but risked hitting someone else nearby?

 

http://www.nature.com/news/machine-ethics-the-robot-s-dilemma-1.17881


Via Complexity Digest
Hakushi Hamaoka's curator insight, Today, 11:18 AM

The concerns about the invisibility of what robots determine could be better resolved by enabling communication between robots and humans. As an extension of the 'machine learning' approach, 'learning human language' seems worth attempting :)

Rescooped by Nuno Edgar Fernandes from Papers

From Entropy to Information: Biased Typewriters and the Origin of Life

The origin of life can be understood mathematically to be the origin of information that can replicate. The likelihood that entropy spontaneously becomes information can be calculated from first principles, and depends exponentially on the amount of information that is necessary for replication. We do not know what the minimum amount of information for self-replication is because it must depend on the local chemistry, but we can study how this likelihood behaves in different known chemistries, and we can study ways in which this likelihood can be enhanced. Here we present evidence from numerical simulations (using the digital life chemistry "Avida") that using a biased probability distribution for the creation of monomers (the "biased typewriter") can exponentially increase the likelihood of spontaneous emergence of information from entropy. We show that this likelihood may depend on the length of the sequence that the information is embedded in, but in a non-trivial manner: there may be an optimum sequence length that maximizes the likelihood. We conclude that the likelihood of spontaneous emergence of self-replication is much more malleable than previously thought, and that the biased probability distributions of monomers that are the norm in biochemistry may significantly enhance these likelihoods.
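A back-of-the-envelope illustration of the central point, with an invented four-letter alphabet and a hypothetical target replicator: the probability of spontaneously assembling a specific sequence is the product of its monomer probabilities, so a bias towards the monomers that sequence actually uses raises the likelihood by orders of magnitude.

```python
# Probability of spontaneously generating one specific target sequence under a
# uniform versus a biased monomer distribution. Alphabet, target sequence and
# bias are invented; the point is the exponential sensitivity to the bias.
import math

alphabet = "ABCD"
target = "AABACAADAABAACAA"      # hypothetical minimal self-replicator (length 16)

uniform = {m: 1 / len(alphabet) for m in alphabet}
biased = {"A": 0.60, "B": 0.15, "C": 0.15, "D": 0.10}   # roughly matched to the target

def log10_probability(sequence, distribution):
    """log10 of the probability of drawing `sequence` i.i.d. from `distribution`."""
    return sum(math.log10(distribution[m]) for m in sequence)

lp_uniform = log10_probability(target, uniform)
lp_biased = log10_probability(target, biased)

print(f"uniform typewriter: 10^{lp_uniform:.1f}")
print(f"biased typewriter : 10^{lp_biased:.1f}")
print(f"enhancement       : 10^{lp_biased - lp_uniform:.1f}")
```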

 

From Entropy to Information: Biased Typewriters and the Origin of Life
Christoph Adami, Thomas LaBar

http://arxiv.org/abs/1506.06988


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from Papers

Cascades in multiplex financial networks with debts of different seniority


A model of a banking network predicts the balance of high- and low-priority debts that ensures financial stability.

 

Synopsis: http://physics.aps.org/synopsis-for/10.1103/PhysRevE.91.062813

 

Cascades in multiplex financial networks with debts of different seniority

 

The seniority of debt, which determines the order in which a bankrupt institution repays its debts, is an important and sometimes contentious feature of financial crises, yet its impact on systemwide stability is not well understood. We capture seniority of debt in a multiplex network, a graph of nodes connected by multiple types of edges. Here an edge between banks denotes a debt contract of a certain level of seniority. Next we study cascading default. There exist multiple kinds of bankruptcy, indexed by the highest level of seniority at which a bank cannot repay all its debts. Self-interested banks would prefer that all their loans be made at the most senior level. However, mixing debts of different seniority levels makes the system more stable in that it shrinks the set of network densities for which bankruptcies spread widely. We compute the optimal ratio of senior to junior debts, which we call the optimal seniority ratio, for two uncorrelated Erdős-Rényi networks. If institutions erode their buffer against insolvency, then this optimal seniority ratio rises; in other words, if default thresholds fall, then more loans should be senior. We generalize the analytical results to arbitrarily many levels of seniority and to heavy-tailed degree distributions.
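A stripped-down, Furfine-style sketch of a seniority-aware default cascade (not the model analysed in the paper; exposures, buffers and write-downs are invented): junior claims on a defaulted bank are written down more heavily than senior claims, and losses propagate until no new bank defaults.

```python
# Simplified seniority-aware default cascade (a Furfine-style sketch, not the
# paper's model). Exposures, capital buffers and write-down rates are invented.
import numpy as np

rng = np.random.default_rng(42)
n_banks = 20

# senior[i, j] / junior[i, j]: amount bank j owes bank i in each layer.
senior = rng.random((n_banks, n_banks)) * (rng.random((n_banks, n_banks)) < 0.1)
junior = rng.random((n_banks, n_banks)) * (rng.random((n_banks, n_banks)) < 0.1)
np.fill_diagonal(senior, 0.0)
np.fill_diagonal(junior, 0.0)

capital = np.full(n_banks, 0.8)                        # loss-absorbing buffer per bank
write_down = {"senior": 0.4, "junior": 1.0}            # junior creditors lose everything

defaulted = np.zeros(n_banks, dtype=bool)
defaulted[0] = True                                    # exogenous failure of bank 0
losses = np.zeros(n_banks)
new_defaults = [0]

while new_defaults:
    # Creditors of each newly defaulted bank take layer-specific write-downs.
    for j in new_defaults:
        losses += write_down["senior"] * senior[:, j]
        losses += write_down["junior"] * junior[:, j]
    # A bank defaults once its accumulated losses exceed its capital buffer.
    newly = np.where(~defaulted & (losses > capital))[0]
    defaulted[newly] = True
    new_defaults = list(newly)

print(f"banks defaulted: {defaulted.sum()} of {n_banks}")
```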

 

Charles D. Brummitt and Teruyoshi Kobayashi

Phys. Rev. E 91, 062813 (2015)

Published June 24, 2015


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from Papers

Nature Physics Focus on Complex networks in finance

The 2008 financial crisis has highlighted major limitations in the modelling of financial and economic systems. However, an emerging field of research at the frontiers of both physics and economics aims to provide a more fundamental understanding of economic networks, as well as practical insights for policymakers. In this Nature Physics Focus, physicists and economists consider the state-of-the-art in the application of network science to finance.

 

http://www.nature.com/nphys/journal/v9/n3/index.html


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from Papers

The evolution of lossy compression

In complex environments, there are costs to both ignorance and perception. An organism needs to track fitness-relevant information about its world, but the more information it tracks, the more resources it must devote to memory and processing. Rate-distortion theory shows that, when errors are allowed, remarkably efficient internal representations can be found by biologically-plausible hill-climbing mechanisms. We identify two regimes: a high-fidelity regime where perceptual costs scale logarithmically with environmental complexity, and a low-fidelity regime where perceptual costs are, remarkably, independent of the environment. When environmental complexity is rising, Darwinian evolution should drive organisms to the threshold between the high- and low-fidelity regimes. Organisms that code efficiently will find themselves able to make, just barely, the most subtle distinctions in their environment.
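The efficient representations discussed above are, formally, points on a rate-distortion curve. The paper searches for them by hill-climbing; the standard Blahut-Arimoto iteration below computes the same trade-off for a toy discrete source (source distribution and distortion matrix are invented).

```python
# Blahut-Arimoto iteration for the rate-distortion function of a discrete source:
# the textbook route to the "efficient lossy representations" discussed above
# (the paper itself uses hill-climbing). Source and distortion matrix are toys.
import numpy as np

def blahut_arimoto(p_x, d, beta, n_iter=500):
    """Return (rate in bits, expected distortion) at trade-off parameter beta."""
    _, nhat = d.shape
    q = np.full(nhat, 1.0 / nhat)                      # marginal over reproductions
    for _ in range(n_iter):
        cond = q[None, :] * np.exp(-beta * d)          # p(x_hat | x) ~ q(x_hat) exp(-beta d)
        cond /= cond.sum(axis=1, keepdims=True)
        q = p_x @ cond                                 # updated reproduction marginal
    cond = q[None, :] * np.exp(-beta * d)
    cond /= cond.sum(axis=1, keepdims=True)
    rate = np.sum(p_x[:, None] * cond * np.log2(cond / q[None, :]))
    distortion = np.sum(p_x[:, None] * cond * d)
    return rate, distortion

# Four environmental states, two internal symbols, Hamming-like distortion.
p_x = np.array([0.4, 0.3, 0.2, 0.1])
d = np.array([[0, 1], [0, 1], [1, 0], [1, 0]], dtype=float)

for beta in (0.5, 2.0, 8.0):
    rate, dist = blahut_arimoto(p_x, d, beta)
    print(f"beta = {beta:4.1f}: rate ~ {rate:.3f} bits, distortion ~ {dist:.3f}")
```

As beta grows the solution moves from the cheap low-fidelity regime (low rate, high distortion) towards the high-fidelity regime the abstract describes.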

 

The evolution of lossy compression
Sarah E. Marzen, Simon DeDeo

http://arxiv.org/abs/1506.06138


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from RJI links

'Native advertising is the future of newspaper advertising.'


Via Brian Steffens
Brian Steffens's curator insight, June 10, 12:32 AM

"... I learned that I want to be part of finding the solution for the future growth of our industry. I’m no longer content to sit by the sidelines with the ‘woe is newspapers’ crowd." -- 2014-2015 RJI Fellow Jaci Smith.

Rescooped by Nuno Edgar Fernandes from CxConferences

Satellite Meeting INFORMATION PROCESSING IN COMPLEX SYSTEMS (IPCS'15)


CCS'15 Satellite Meeting: Information Processing in Complex Systems (IPCS'15)

Abstracts due:     June 20
Decision of admission:     June 25
Satellite meeting:     October 1

 

All systems in nature have one thing in common: they process information. Information is registered in the state of a system and its elements, implicitly and invisibly. As elements interact, information is transferred. Indeed, bits of information about the state of one element will travel – imperfectly – to the state of the other element, forming its new state. This storage and transfer of information, possibly between levels of a multi-level system, is imperfect due to randomness or noise. From this viewpoint, a system can be formalized as a collection of bits that is organized according to its rules of dynamics and its topology of interactions. Mapping out exactly how these bits of information percolate through the system could reveal new fundamental insights into how the parts orchestrate to produce the properties of the system. A theory of information processing would be capable of defining a set of universal properties of dynamical multi-level complex systems, which describe and compare the dynamics of diverse complex systems ranging from social interaction to brain networks, from financial markets to biomedicine. Each possible combination of rules of dynamics and topology of interactions, with disparate semantics, would reduce to a single language of information processing.


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from Papers

The dynamic of information-driven coordination phenomena: a transfer entropy analysis

Data from social media are providing unprecedented opportunities to investigate the processes that rule the dynamics of collective social phenomena. Here, we consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of micro-blogging time series to extract directed networks of influence among geolocalized sub-units in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time-scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social sub-units. In the absence of a clear exogenous driving, social collective phenomena can be represented as endogenously-driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data.
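A compact sketch of symbolic transfer entropy on two synthetic, one-way-coupled time series (standard ordinal-pattern symbolisation and a plug-in estimator; not the authors' geolocalised Twitter pipeline, and all parameters are arbitrary):

```python
# Symbolic transfer entropy T(X -> Y): symbolise each series by ordinal patterns
# of length m, then use plug-in probabilities. Synthetic, one-way coupled data:
# Y follows X with a one-step lag, so T(X->Y) should exceed T(Y->X).
import numpy as np
from collections import Counter

def symbolize(x, m=3):
    """Ordinal (rank) pattern of each length-m window of the series."""
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def transfer_entropy(src, dst, m=3):
    """Plug-in estimate, in bits, of the symbolic transfer entropy src -> dst."""
    s, d = symbolize(src, m), symbolize(dst, m)
    triples = Counter(zip(d[1:], d[:-1], s[:-1]))      # (d_{t+1}, d_t, s_t)
    pairs_dd = Counter(zip(d[1:], d[:-1]))
    pairs_ds = Counter(zip(d[:-1], s[:-1]))
    marginal_d = Counter(d[:-1])
    n = len(d) - 1
    te = 0.0
    for (d1, d0, s0), count in triples.items():
        p_joint = count / n
        p_cond_full = count / pairs_ds[(d0, s0)]             # p(d_{t+1} | d_t, s_t)
        p_cond_self = pairs_dd[(d1, d0)] / marginal_d[d0]    # p(d_{t+1} | d_t)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

rng = np.random.default_rng(3)
steps = 5000
x = rng.normal(size=steps)
y = np.zeros(steps)
for t in range(1, steps):
    y[t] = 0.6 * x[t - 1] + 0.4 * rng.normal()

print(f"T(X->Y) ~ {transfer_entropy(x, y):.3f} bits")
print(f"T(Y->X) ~ {transfer_entropy(y, x):.3f} bits")
```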

 

The dynamic of information-driven coordination phenomena: a transfer entropy analysis
Javier Borge-Holthoefer, Nicola Perra, Bruno Gonçalves, Sandra González-Bailón, Alex Arenas, Yamir Moreno, Alessandro Vespignani

http://arxiv.org/abs/1507.06106


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from Papers

Information thermodynamics of near-equilibrium computation

In studying fundamental physical limits and properties of computational processes, one is faced with the challenges of interpreting primitive information-processing functions through well-defined information-theoretic as well as thermodynamic quantities. In particular, transfer entropy, characterizing the function of computational transmission and its predictability, is known to peak near critical regimes. We focus on a thermodynamic interpretation of transfer entropy aiming to explain the underlying critical behavior by associating information flows intrinsic to computational transmission with particular physical fluxes. Specifically, in isothermal systems near thermodynamic equilibrium, the gradient of the average transfer entropy is shown to be dynamically related to Fisher information and the curvature of the system's entropy. This relationship explicitly connects the predictability, sensitivity, and uncertainty of computational processes intrinsic to complex systems and allows us to consider thermodynamic interpretations of several important extreme cases and trade-offs.
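For reference, the two quantities being related are, in their standard textbook forms (these are the generic definitions, not the paper's specific near-equilibrium result):

```latex
% Transfer entropy from a source X to a destination Y (predictability of transmission)
T_{X \to Y} \;=\; \sum_{y_{t+1},\, y_t,\, x_t} p(y_{t+1}, y_t, x_t)\,
                  \log \frac{p(y_{t+1} \mid y_t, x_t)}{p(y_{t+1} \mid y_t)}

% Fisher information of a parametrised distribution p(x; \theta)
\mathcal{F}(\theta) \;=\; \mathbb{E}\!\left[ \left( \frac{\partial}{\partial \theta}
                           \ln p(X; \theta) \right)^{2} \right]
```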

 

Information thermodynamics of near-equilibrium computation
Mikhail Prokopenko and Itai Einav
Phys. Rev. E 91, 062143

http://dx.doi.org/10.1103/PhysRevE.91.062143


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from Papers

Modeling evolutionary games in populations with demographic structure

Classic life history models are often based on optimization algorithms, focusing on the adaptation of survival and reproduction to the environment, while neglecting frequency dependent interactions in the population. Evolutionary game theory, on the other hand, studies frequency dependent strategy interactions, but usually omits life history and the demographic structure of the population. Here we show how an integration of both aspects can substantially alter the underlying evolutionary dynamics.
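A toy sketch of the kind of integration described above (payoff matrix, survival schedule and fecundity mapping are all invented, not the authors' model): frequency-dependent payoffs determine births, while an age-structured survival schedule is applied on top of the game.

```python
# Toy coupling of a frequency-dependent game with age structure. The payoff
# matrix, survival schedule and fecundity mapping are invented, not the paper's.
import numpy as np

payoff = np.array([[3.0, 1.0],                  # row: focal strategy (C, D); column: opponent
                   [4.0, 2.0]])
survival = np.array([0.9, 0.7, 0.4, 0.0])       # age-specific survival probabilities
n_ages = len(survival)

# population[strategy, age] = number of individuals
population = np.full((2, n_ages), 25.0)

for generation in range(200):
    totals = population.sum(axis=1)
    frequency = totals / totals.sum()           # all age classes interact in the game
    fitness = payoff @ frequency                # frequency-dependent payoff per strategy
    births = 0.1 * fitness * totals             # fecundity proportional to payoff (assumed)
    survivors = population * survival           # demographic side: age-specific survival
    population = np.column_stack([births, survivors[:, :-1]])   # newborns enter age 0

frequencies = population.sum(axis=1) / population.sum()
print("final strategy frequencies (C, D):", frequencies.round(3))
```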


Via Complexity Digest
Marcelo Errera's curator insight, July 23, 11:46 PM

This is a model that will most likely show the evolution of configuration (social organization) in the direction that facilitates some current imposed on the population.

David M. Boje's curator insight, July 27, 8:47 AM

Antenarrative combines life history and life-future with the prereflexive threads of intention that both story and narrative omit.

Rescooped by Nuno Edgar Fernandes from CxAnnouncements

Two PhD fellowships in "Complexity Economics" @UGent

The PhD student will work within the larger LAB-M project "Live agent based models and the theory of adaptive multi-type and multilayer networks to study economic complexity and financial instability", funded by Ghent University and by the Flemish Fund for Scientific Research (FWO-Vlaanderen). The LAB-M project aims to model interactions of real persons with real incentives in massive multiplayer online games (MMOG) with realistic economic environments to test a wide plethora of economic and political theories and develop new theories rooted in network theory and methodology. The LAB-M world provides natural experiments and collects multi-faceted data on the interaction of people in adaptive multilayer networks. We will develop a new methodological language for social physics, rooted in adaptive multi-type multilayer networks. The PhD students will be based at the Economics and Physics Department of Ghent University (Belgium), and will be part of a research team, coordinated by Prof. Jan Ryckebusch and Prof. Koen Schoors. The PhD student will also be guided by international external advisors. 

 

http://inwpent5.ugent.be/Vacancies


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from CxAnnouncements

Postdoctoral position in complex systems (Econophysics)

Are you fascinated by interdisciplinary work? — Are you into data analysis and model building? If yes, you might be interested in this position. In the last decades Econophysics emerged as a new, interdisciplinary field. Our group has longstanding expertise. We develop models for various issues in the economy, particularly in the financial markets. We apply the same standards as in traditional physics and base our models as much as possible on the empirical information.

 

University of Duisburg-Essen

http://www.theo.physik.uni-due.de/tp/ags/guhr_dir/positions.php?lang=en


Via Complexity Digest
A. J. Alvarez-Socorro's curator insight, June 23, 2:16 AM

Becas / Scholarships / Fellowships

Rescooped by Nuno Edgar Fernandes from COMPUTATIONAL THINKING and CYBERLEARNING

Codecademy Summer of Code

RT @margotcodes: Know any motivated students? Get them in the coding game with our #CCSummerOfCode Challenge: http://t.co/tdcJtIidfY http:/…

Via Bonnie Bracey Sutton
Rescooped by Nuno Edgar Fernandes from Papers

25 Years of Self-Organized Criticality: Numerical Detection Methods

The detection and characterization of self-organized criticality (SOC), in both real and simulated data, has undergone many significant revisions over the past 25 years. The explosive advances in the many numerical methods available for detecting, discriminating, and ultimately testing, SOC have played a critical role in developing our understanding of how systems experience and exhibit SOC. In this article, methods of detecting SOC are reviewed, from correlations to complexity to critical quantities. A description of the basic autocorrelation method leads into a detailed analysis of application-oriented methods developed in the last 25 years. In the second half of this manuscript, space-based, time-based and spatial-temporal methods are reviewed and the prevalence of power laws in nature is described, with an emphasis on event detection and characterization. The search for numerical methods to clearly and unambiguously detect SOC in data often leads us outside the comfort zone of our own disciplines: the answers to these questions are often obtained by studying the advances made in other fields of study. In addition, numerical detection methods often provide the optimum link between simulations and experiments in scientific research. We seek to explore this boundary where the rubber meets the road, to review this expanding field of research of numerical detection of SOC systems over the past 25 years, and to iterate forwards so as to provide some foresight and guidance into developing breakthroughs in this subject over the next quarter of a century.
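One recurring numerical ingredient in SOC detection is fitting a power-law exponent to event-size data. A minimal version of the standard continuous maximum-likelihood estimator (the Hill / Clauset-Shalizi-Newman form), exercised on synthetic data, is:

```python
# Maximum-likelihood fit of a power-law exponent for event sizes x >= x_min:
# alpha_hat = 1 + n / sum(ln(x_i / x_min)). Data here are synthetic, drawn from
# a known power law purely to exercise the estimator.
import numpy as np

rng = np.random.default_rng(7)
alpha_true, x_min, n = 2.5, 1.0, 10_000

# Inverse-CDF sampling from p(x) ~ x**(-alpha) for x >= x_min.
x = x_min * (1.0 - rng.random(n)) ** (-1.0 / (alpha_true - 1.0))

alpha_hat = 1.0 + n / np.log(x / x_min).sum()
std_error = (alpha_hat - 1.0) / np.sqrt(n)

print(f"true alpha = {alpha_true}, estimated alpha = {alpha_hat:.3f} +/- {std_error:.3f}")
```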

 

 

25 Years of Self-Organized Criticality: Numerical Detection Methods
R.T. James McAteer, Markus J. Aschwanden, Michaila Dimitropoulou, Manolis K. Georgoulis, Gunnar Pruessner, Laura Morales, Jack Ireland, Valentyna Abramenko

http://arxiv.org/abs/1506.08142


Via Complexity Digest
Marcelo Errera's curator insight, July 25, 11:12 AM

What is complexity? Is it an effect caused by some principle? Is it the cause of something else? Those questions are fundamental.

By the way, CL publications have been showing that power laws, not exponentials, are the correct mathematical expression for most natural phenomena.

Rescooped by Nuno Edgar Fernandes from Papers

A Neural Conversational Model

Conversational modeling is an important task in natural language understanding and machine intelligence. Although previous approaches exist, they are often restricted to specific domains (e.g., booking an airline ticket) and require hand-crafted rules. In this paper, we present a simple approach for this task which uses the recently proposed sequence to sequence framework. Our model converses by predicting the next sentence given the previous sentence or sentences in a conversation. The strength of our model is that it can be trained end-to-end and thus requires much fewer hand-crafted rules. We find that this straightforward model can generate simple conversations given a large conversational training dataset. Our preliminary results suggest that, despite optimizing the wrong objective function, the model is able to extract knowledge from both a domain specific dataset, and from a large, noisy, and general domain dataset of movie subtitles. On a domain-specific IT helpdesk dataset, the model can find a solution to a technical problem via conversations. On a noisy open-domain movie transcript dataset, the model can perform simple forms of common sense reasoning. As expected, we also find that the lack of consistency is a common failure mode of our model.
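At inference time such a model simply emits the most probable next token until an end-of-sequence marker. The sketch below shows that loop; `next_token_distribution` is a hypothetical stand-in for the trained encoder-decoder (hard-coded here so the script runs), not the paper's model.

```python
# Greedy decoding loop for a conversational sequence-to-sequence model.
# `next_token_distribution` is a hypothetical stand-in for the trained
# encoder-decoder (hard-coded toy so the script runs); NOT the paper's model.
from typing import Dict, List

EOS = "<eos>"

def next_token_distribution(context: List[str], generated: List[str]) -> Dict[str, float]:
    """A real model would return softmax probabilities over its vocabulary."""
    canned = ["hello", "how", "can", "i", "help", EOS]
    step = len(generated)
    return {canned[step] if step < len(canned) else EOS: 1.0}

def reply(previous_utterance: str, max_length: int = 20) -> str:
    """Predict the next sentence token by token, given the previous sentence."""
    context = previous_utterance.lower().split()
    generated: List[str] = []
    while len(generated) < max_length:
        probabilities = next_token_distribution(context, generated)
        token = max(probabilities, key=probabilities.get)   # greedy decoding
        if token == EOS:
            break
        generated.append(token)
    return " ".join(generated)

print(reply("hi , my vpn is not working"))   # -> "hello how can i help"
```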

 

A Neural Conversational Model
Oriol Vinyals, Quoc Le

http://arxiv.org/abs/1506.05869


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from Papers

The evolutionary advantage of cooperation

The present study asks how cooperation and consequently structure can emerge in many different evolutionary contexts. Cooperation, here, is a persistent behavioural pattern of individual entities pooling and sharing resources. Examples are: individual cells forming multicellular systems whose various parts pool and share nutrients; pack animals pooling and sharing prey; families, firms, or modern nation states pooling and sharing financial resources. In these examples, each atomistic decision, at a point in time, of the better-off entity to cooperate poses a puzzle: the better-off entity will book an immediate net loss -- why should it cooperate? For each example, specific explanations have been put forward. Here we point out a very general mechanism -- a sufficient null model -- whereby cooperation can evolve. The mechanism is based on the following insight: natural growth processes tend to be multiplicative. In multiplicative growth, ergodicity is broken in such a way that fluctuations have a net-negative effect on the time-average growth rate, although they have no effect on the growth rate of the ensemble average. Pooling and sharing resources reduces fluctuations, which leaves ensemble averages unchanged but -- contrary to common perception -- increases the time-average growth rate for each cooperator.
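The null model can be simulated in a few lines (parameters invented): each entity's resources are multiplied by a random factor every period; cooperators pool and split equally, which leaves the ensemble average unchanged but raises the time-average growth rate.

```python
# Null-model simulation: multiplicative growth with noisy factors. Each period a
# resource is multiplied by 1.5 or 0.6 with equal probability (ensemble-average
# factor 1.05). Cooperators pool and split equally every period. Parameters invented.
import numpy as np

rng = np.random.default_rng(2015)
periods = 1_000_000
factors = rng.choice([1.5, 0.6], size=(2, periods))      # one row of growth factors per entity

loner_growth = np.log(factors).mean(axis=1)              # time-average log growth, no pooling
cooperator_growth = np.log(factors.mean(axis=0)).mean()  # per-capita growth with pooling

print("ensemble-average factor       :", factors.mean().round(4))    # ~1.05 either way
print("loner time-average growth     :", loner_growth.round(4))      # ~0.5*ln(0.9), negative
print("cooperator time-average growth:", round(cooperator_growth, 4))  # larger than loners'
```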

 

The evolutionary advantage of cooperation
Ole Peters, Alexander Adamou

http://arxiv.org/abs/1506.03414


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from Papers

John Forbes Nash Jr. (1928–2015)


In the fall of 1949, many graduate students at Princeton University were assigned rooms in the Graduate College. In one suite, John Nash inhabited a single room, while I shared the double with Lloyd Shapley. John and Lloyd were the mathematicians and I was the economist, and together we pursued our interest in game theory. John was one of the youngest students at the Graduate College. He was from West Virginia, where his father was an engineer and his mother a Latin teacher. He graduated from the Carnegie Institute of Technology with bachelor's and master's degrees in mathematics, and arrived at the math department in Princeton in 1948.

 

John Forbes Nash Jr. (1928–2015)
Martin Shubik

Science 19 June 2015:
Vol. 348 no. 6241 p. 1324
http://dx.doi.org/10.1126/science.aac7085


Via Complexity Digest
Marcelo Errera's curator insight, June 23, 10:46 PM

His legacy went beyond Math. I wonder if game theory leads to configurations (organizations) that facilitate the flow.

Configurations would be otherwise unbalanced, unstable and likely to be surpassed.

Rescooped by Nuno Edgar Fernandes from Papers

The non-linear health consequences of living in larger cities

Urbanization promotes economy, mobility, access and availability of resources, but on the other hand, generates higher levels of pollution, violence, crime, and mental distress. The health consequences of the agglomeration of people living close together are not fully understood. Particularly, it remains unclear how variations in the population size across cities impact the health of the population. We analyze the deviations from linearity of the scaling of several health-related quantities, such as the incidence and mortality of diseases, external causes of death, wellbeing, and health-care availability, with respect to the population size of cities in Brazil, Sweden and the USA. We find that deaths by non-communicable diseases tend to be relatively less common in larger cities, whereas the per-capita incidence of infectious diseases is relatively larger for increasing population size. Healthier life style and availability of medical support are disproportionally higher in larger cities. The results are connected with the optimization of human and physical resources, and with the non-linear effects of social networks in larger populations. An urban advantage in terms of health is not evident and using rates as indicators to compare cities with different population sizes may be insufficient.
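The "deviation from linearity" is typically assessed by fitting a scaling relation Y ≈ a·N^β on log-log axes and asking how far β departs from 1. A minimal version with synthetic city data (all numbers invented):

```python
# Fit the scaling exponent beta in Y = a * N**beta by least squares on log-log
# axes, then compare it with linear (per-capita) scaling beta = 1. City sizes
# and case counts are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(11)
population = 10 ** rng.uniform(4, 7, size=300)          # 300 synthetic cities
beta_true = 1.15                                        # superlinear, e.g. infection counts
cases = 1e-3 * population ** beta_true * rng.lognormal(0.0, 0.3, size=300)

beta_hat, log_a = np.polyfit(np.log10(population), np.log10(cases), 1)
residuals = np.log10(cases) - (log_a + beta_hat * np.log10(population))

print(f"estimated beta = {beta_hat:.3f}  (beta > 1: incidence grows faster than population)")
print(f"spread of log10 residuals = {residuals.std():.3f}")
```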

 

The non-linear health consequences of living in larger cities
Luis E. C. Rocha, Anna E. Thorson, Renaud Lambiotte

http://arxiv.org/abs/1506.02735


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from Papers

Exposure to ideologically diverse news and opinion on Facebook

Exposure to news, opinion, and civic information increasingly occurs through social media. How do these online networks influence exposure to perspectives that cut across ideological lines? Using deidentified data, we examined how 10.1 million U.S. Facebook users interact with socially shared news. We directly measured ideological homophily in friend networks and examined the extent to which heterogeneous friends could potentially expose individuals to cross-cutting content. We then quantified the extent to which individuals encounter comparatively more or less diverse content while interacting via Facebook’s algorithmically ranked News Feed and further studied users’ choices to click through to ideologically discordant content. Compared with algorithmic ranking, individuals’ choices played a stronger role in limiting exposure to cross-cutting content.

 

Exposure to ideologically diverse news and opinion on Facebook
Eytan Bakshy, Solomon Messing, Lada A. Adamic

Science 5 June 2015:
Vol. 348 no. 6239 pp. 1130-1132
http://dx.doi.org/10.1126/science.aaa1160


Via Complexity Digest
Rescooped by Nuno Edgar Fernandes from RJI links

21 Essential Data Visualization Tools

If you are a data lover, then you must check the 21 data visualization tools we curated for you. Most of them are free and easy to use.

Via Brian Steffens
Brian Steffens's curator insight, June 2, 6:26 PM

You're all skilled computer graphics creators, right? No? Then check out this list of tools.