New genetic research reveals that a small group of hunter-gatherers now living in Southern Africa was once so large that it comprised the majority of living humans during most of the past 150,000 years. Only during the last 22,000 years have the other African populations, including those that gave rise to Europeans and Asians, become vastly more numerous. Today the Khoisan (who sometimes call themselves Bushmen) number about 100,000 individuals, while the rest of humanity numbers 7 billion. Their lives and ways have remained largely unaltered for hundreds of generations, with only recent events endangering their hunter-gatherer lifestyles. The study's findings were published in Nature Communications on 4 December 2014.
As California finally experiences the arrival of a rain-bearing Pineapple Express this week, two climate scientists have shown that the drought of 2012-2014 has been the worst in 1,200 years.
Daniel Griffin, an assistant professor in the Department of Geography, Environment and Society at the University of Minnesota, and Kevin Anchukaitis, an assistant scientist at Woods Hole Oceanographic Institution, asked the question, "How unusual is the ongoing California drought?" Watching the severity of the California drought intensify since last autumn, they wondered how it would eventually compare to other extreme droughts throughout the state's history.
To answer those questions, Griffin and Anchukaitis collected new tree-ring samples from blue oak trees in southern and central California. "California's old blue oaks are as close to nature's rain gauges as we get," says Griffin. "They thrive in some of California's driest environments." These trees are particularly sensitive to moisture changes and their tree rings display moisture fluctuations vividly.
As soon as the National Oceanic and Atmospheric Administration (NOAA) released climate data for the summer of 2014, the two scientists sprang into action. Using their blue oak data, they reconstructed rainfall back to the 13th century. They also calculated the severity of the drought by combining NOAA's estimates of the Palmer Drought Severity Index (PDSI), an index of soil moisture variability, with the existing North American Drought Atlas, a spatial tree-ring based reconstruction of drought developed by scientists at Columbia University's Lamont-Doherty Earth Observatory. These resources together provided complementary data on rainfall and soil moisture over the past millennium. Griffin and Anchukaitis found that while the current period of low precipitation is not unusual in California's history, these rainfall deficits combined with sustained record high temperatures created the current multiyear severe water shortages. "While it is precipitation that sets the rhythm of California drought, temperature weighs in on the pitch," says Anchukaitis.
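Reconstructions of this kind typically calibrate a tree-ring index against overlapping instrumental records and then hindcast past rainfall from the fitted relationship. The sketch below shows the idea with ordinary least squares on made-up numbers; it is illustrative only, not Griffin and Anchukaitis's actual method or data:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x.

    In a classical tree-ring calibration, `xs` would be a ring-width index
    and `ys` the overlapping instrumental rainfall; the fitted line is then
    used to hindcast rainfall for years before instruments existed.
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical calibration period: ring-width index vs. rainfall (mm)
ring_index = [0.6, 0.8, 1.0, 1.2, 1.4]
rainfall = [310, 390, 470, 550, 630]
a, b = fit_line(ring_index, rainfall)

# Hindcast rainfall for a pre-instrumental year whose rings give index 0.7
estimate = a + b * 0.7
```

Real reconstructions add many refinements (detrending for tree age, cross-validation against withheld years), but the calibrate-then-hindcast logic is the core of the approach.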
"We were genuinely surprised at the result," says Griffin, a NOAA Climate & Global Change Fellow and former WHOI postdoctoral scholar. "This is California--drought happens. Time and again, the most common result in tree-ring studies is that drought episodes in the past were more extreme than those of more recent eras. This time, however, the result was different." While there is good evidence of past sustained, multi-decadal droughts or so-called "megadroughts" in California, the authors say those past episodes were probably punctuated by occasional wet years, even if the cumulative effect over decades was one of overall drying. The current short-term drought appears to be worse than any previous span of consecutive years of drought without reprieve.
Tree rings are a valuable data source when tracking historical climate, weather and natural disaster trends. Floods, fires, drought and other elements that can affect growing conditions are reflected in the development of tree rings, and since each ring represents one year, the samples collected from centuries-old trees form a virtual timeline that extends beyond the historical record in North America.
Tentative new work from Julian Barbour of the University of Oxford, Tim Koslowski of the University of New Brunswick and Flavio Mercati of the Perimeter Institute for Theoretical Physics suggests that perhaps the arrow of time doesn’t really require a fine-tuned, low-entropy initial state at all but is instead the inevitable product of the fundamental laws of physics. Barbour and his colleagues argue that it is gravity, rather than thermodynamics, that draws the bowstring to let time’s arrow fly. Their findings were published in October in Physical Review Letters.
The team’s conclusions come from studying an exceedingly simple proxy for our universe, a computer simulation of 1,000 pointlike particles interacting under the influence of Newtonian gravity. They investigated the dynamic behavior of the system using a measure of its "complexity," which corresponds to the ratio of the distance between the system’s closest pair of particles and the distance between the most widely separated particle pair. The system’s complexity is at its lowest when all the particles come together in a densely packed cloud, a state of minimum size and maximum uniformity roughly analogous to the big bang. The team’s analysis showed that essentially every configuration of particles, regardless of their number and scale, would evolve into this low-complexity state. Thus, the sheer force of gravity sets the stage for the system’s expansion and the origin of time’s arrow, all without any delicate fine-tuning to first establish a low-entropy initial condition.
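The complexity measure described above can be made concrete with a few lines of code. The sketch below uses the ratio of the largest to the smallest pairwise separation as a crude stand-in for the team's measure; the particle coordinates are invented for illustration:

```python
import itertools
import math

def complexity(points):
    """Ratio of the largest to the smallest pairwise separation.

    A crude stand-in for the measure described above: it is smallest for a
    dense, uniform cloud and grows as the particles form widely separated
    clumps and structure.
    """
    dists = [math.dist(p, q) for p, q in itertools.combinations(points, 2)]
    return max(dists) / min(dists)

uniform_cloud = [(0, 0), (1, 0), (0, 1), (1, 1)]  # packed, uniform square
clumped = [(0, 0), (0.1, 0), (10, 0), (10.1, 0)]  # two tight, distant pairs
```

For the uniform square the ratio is just sqrt(2), while the clumped configuration scores two orders of magnitude higher: structure formation under gravity drives this number up, which is why the dense uniform state serves as the low-complexity analogue of the big bang.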
From that low-complexity state, the system of particles then expands outward in both temporal directions, creating two distinct, symmetric and opposite arrows of time. Along each of the two temporal paths, gravity then pulls the particles into larger, more ordered and complex structures—the model’s equivalent of galaxy clusters, stars and planetary systems. From there, the standard thermodynamic passage of time can manifest and unfold on each of the two divergent paths. In other words, the model has one past but two futures. As hinted by the time-indifferent laws of physics, time’s arrow may in a sense move in two directions, although any observer can only see and experience one. “It is the nature of gravity to pull the universe out of its primordial chaos and create structure, order and complexity,” Mercati says. “All the solutions break into two epochs, which go on forever in the two time directions, divided by this central state which has very characteristic properties.”
Although the model is crude, and does not incorporate either quantum mechanics or general relativity, its potential implications are vast. If it holds true for our actual universe, then the big bang could no longer be considered a cosmic beginning but rather only a phase in an effectively timeless and eternal universe. More prosaically, a two-branched arrow of time would lead to curious incongruities for observers on opposite sides. “This two-futures situation would exhibit a single, chaotic past in both directions, meaning that there would be essentially two universes, one on either side of this central state,” Barbour says. “If they were complicated enough, both sides could sustain observers who would perceive time going in opposite directions. Any intelligent beings there would define their arrow of time as moving away from this central state. They would think we now live in their deepest past.”
We live in chaotic times. Many feel that our fragile economy could come crashing down at any moment. One devastating terrorist attack, false-flag attack or natural disaster could trigger an unprecedented catastrophe, and martial law would be declared. Some Americans would take to the streets, and the only remaining question is whether American soldiers, called to the scene to restore order, would fire upon American citizens when ordered to do so.
This scenario, and the resulting public execution of American citizens for protesting, has happened many times in our past. For those old enough to remember, the 1970 Kent State massacre should come to mind: the Ohio National Guard opened fire on protesting college students on the campus of Kent State University. But for those who believe that this was merely an anomaly, let's examine what the field of psychology has discovered about this question.
We provide direct evidence of market manipulation at the beginning of the financial crisis in November 2007. The type of market manipulation, a "bear raid," would have been prevented by a regulation that was repealed by the Securities and Exchange Commission in July 2007. The regulation, the uptick rule, was designed to prevent market manipulation and promote stability and had been in force since 1938 as a key part of the government response to the 1929 market crash and its aftermath. On November 1, 2007, Citigroup experienced an unusual increase in trading volume and decrease in price. Our analysis of financial industry data shows that this decline coincided with an anomalous increase in borrowed shares, the selling of which would represent a large fraction of the total trading volume. The selling of borrowed shares cannot be explained by news events, as there is no corresponding increase in selling by share owners. A similar number of shares was returned on a single day six days later. The magnitude and coincidence of the borrowing and returning of shares are evidence of a concerted effort to drive down Citigroup's stock price and achieve a profit, i.e., a bear raid. Interpretations and analyses of financial markets should consider the possibility that the intentional actions of individual actors or coordinated groups can impact market behavior. Markets are not sufficiently transparent to reveal or prevent even major market manipulation events. Our results point to the need for regulations that prevent intentional actions that cause markets to deviate from equilibrium value and contribute to market crashes. Enforcement actions, even if they take place, cannot reverse severe damage to the economic system. The current "alternative" uptick rule, which is in effect only for stocks dropping by more than 10% in a single day, is insufficient.
Improved availability of market data and reinstatement of either the original uptick rule or other transaction limitations may help prevent market instability.
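The tick test at the heart of the original uptick rule is simple enough to sketch in code. The version below is a deliberate simplification for illustration, not the SEC's actual rule text: a short sale is permitted only on a "plus tick" (a price above the last trade) or a "zero-plus tick" (equal to the last trade, which itself printed above the last different price):

```python
def uptick_test(trades, short_price):
    """Simplified tick test in the spirit of the original (1938) uptick rule.

    `trades` is the sequence of prior trade prices; a short sale at
    `short_price` is allowed only on a plus tick or a zero-plus tick.
    Illustrative only -- not the SEC rule text.
    """
    last = trades[-1]
    # Last price that differed from the most recent trade, if any.
    prev_different = next((p for p in reversed(trades) if p != last), None)
    if short_price > last:
        return True  # plus tick: selling above the last trade
    if short_price == last and prev_different is not None and last > prev_different:
        return True  # zero-plus tick: same price, but the last move was up
    return False     # selling at or into a declining price is blocked
```

By blocking short sales into a falling price, the rule makes the share-borrowing-and-dumping pattern described in the abstract far harder to execute.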
Prediction markets are markets for contracts that yield payments based on the outcome of an uncertain future event, such as a presidential election. Using these markets as forecasting tools could substantially improve decision making in the private and public sectors.
We argue that U.S. regulators should lower barriers to the creation and design of prediction markets by creating a safe harbor for certain types of small stakes markets. We believe our proposed change has the potential to stimulate innovation in the design and use of prediction markets throughout the economy, and in the process to provide information that will benefit the private sector and government alike.
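One standard mechanism for running such markets is Hanson's logarithmic market scoring rule (LMSR), an automated market maker whose prices sum to one and read directly as implied probabilities. The sketch below is offered as background; the source does not prescribe any particular market mechanism:

```python
import math

def lmsr_prices(quantities, b=100.0):
    """Instantaneous prices under Hanson's logarithmic market scoring rule.

    `quantities[i]` is the number of shares sold so far for outcome i, and
    `b` is the liquidity parameter. Prices sum to 1 and can be read as the
    market's implied probabilities for the outcomes.
    """
    weights = [math.exp(q / b) for q in quantities]
    total = sum(weights)
    return [w / total for w in weights]

# No contracts sold yet: a two-outcome event is priced at 50/50.
p_even = lmsr_prices([0.0, 0.0])

# After traders buy 50 shares of outcome 0, its implied probability rises.
p_tilted = lmsr_prices([50.0, 0.0])
```

The liquidity parameter `b` caps the market maker's worst-case loss, which is one reason the mechanism is attractive for the small-stakes markets the safe-harbor proposal contemplates.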
A mathematical model (Core Model) is presented that describes the gross dynamic behavior of the demographic transition—falling death rates lead to population increase, temporarily rising birth rates, temporarily increased population growth, decreased birth rates, and, ultimately, a return toward slower population growth.
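The qualitative dynamics of the demographic transition can be illustrated with a toy discrete-time simulation. This is a hedged sketch with invented rates and lags, not the paper's actual Core Model:

```python
def demographic_transition(steps=200):
    """Toy discrete-time sketch of the demographic transition.

    The death rate falls first; the birth rate falls after a lag, so
    population growth spikes during the transition and then settles back
    toward zero. Illustrative only -- not the paper's Core Model.
    """
    population = 1.0
    history = []
    for t in range(steps):
        death_rate = 0.04 - 0.03 * min(1.0, t / 50)               # falls early
        birth_rate = 0.04 - 0.03 * min(1.0, max(0, t - 30) / 50)  # falls later
        population *= 1.0 + (birth_rate - death_rate)
        history.append(population)
    return history

history = demographic_transition()
```

Because the birth rate lags the death rate by 30 steps, the population grows only during the transition window; once both rates converge, growth returns to zero at a permanently higher population level.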
Scientists have taken an important step towards the possibility of creating synthetic life with the development of a form of artificial evolution in a simple chemistry set without DNA.
A team from the University of Glasgow’s School of Chemistry report in a new paper in the journal Nature Communications today (Monday 8 December) on how they have managed to create an evolving chemical system for the first time. The process uses a robotic ‘aid’ and could be used in the future to ‘evolve’ new chemicals capable of performing specific tasks.
The researchers used a specially designed open-source robot, based on a cheap 3D printer, to create and monitor droplets of oil in water-filled Petri dishes in their lab. Each droplet was composed of a slightly different mixture of four chemical compounds.
Droplets of oil move in water like primitive chemical machines, transferring chemical energy to kinetic energy. The researchers’ robot used a video camera to monitor, process and analyse the behaviour of 225 differently-composed droplets, identifying a number of distinct characteristics such as vibration or clustering.
The team picked out three types of droplet behavior – division, movement and vibration – to focus on in the next stage of the research. They used the robot to deposit four droplets of the same composition, then ranked the droplets in order of how closely they fit the criteria of behaviour identified by the researchers. The chemical composition of the ‘fittest’ droplet was then carried over into a second generation of droplets, and the process of robotic selection was begun again.
Over the course of 20 repetitions of the process, the researchers found that the droplets became more stable, mimicking the natural selection of evolution.
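The selection loop described above is essentially a simple genetic algorithm. The sketch below mirrors its shape; the fitness function is a stand-in of my own (the real experiment scored droplets by robot video analysis of division, movement and vibration), and the compositions are invented:

```python
import random

random.seed(42)  # reproducible run

def mutate(mixture, rate=0.05):
    """Perturb a four-component composition and renormalise to sum to 1."""
    perturbed = [max(1e-6, x + random.uniform(-rate, rate)) for x in mixture]
    total = sum(perturbed)
    return [x / total for x in perturbed]

def fitness(mixture):
    """Stand-in score: reward closeness to an arbitrary 'ideal' composition.

    The real experiment instead ranked droplets by robot-observed behaviour
    (division, movement, vibration)."""
    ideal = [0.4, 0.3, 0.2, 0.1]
    return -sum((a - b) ** 2 for a, b in zip(mixture, ideal))

# Four droplets per generation, 20 generations, echoing the study's protocol.
population = [[0.25, 0.25, 0.25, 0.25] for _ in range(4)]
best_scores = []
for generation in range(20):
    best = max(population, key=fitness)
    best_scores.append(fitness(best))
    # The fittest composition seeds the next generation: kept unchanged,
    # plus three mutated copies, mirroring the robotic selection step.
    population = [best] + [mutate(best) for _ in range(3)]
```

Keeping the best mixture unchanged (elitism) guarantees fitness never regresses between generations, which is the software analogue of the droplet stability the researchers observed improving over 20 rounds.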
The research team was led by Professor Lee Cronin, the University of Glasgow’s Regius Chair of Chemistry. Professor Cronin said: “This is the first time that an evolvable chemical system has existed outside of biology. Biological evolution has given rise to enormously complex and sophisticated forms of life, and our robot-driven form of evolution could have the potential to do something similar for chemical systems.
The international health community is celebrating what may prove to be a turning point in the global fight against malaria. Deaths from the mosquito-borne disease have been almost halved since the turn of the millennium, according to a new report from the World Health Organization (WHO), with experts saying they’re confident the illness can one day be eradicated entirely.
However, although the malaria mortality rate fell by 47 percent globally and by 54 percent in Africa, the WHO warns that much more still needs to be done. Dozens of countries are reporting insecticide resistance among their mosquito populations, and in Africa — where 90 percent of malaria deaths occur — some 278 million people lack even the basic protection of an insecticide-treated mosquito net.
The disease also continues to disproportionately affect children in poor countries. Of the estimated global malaria death toll of 584,000 in 2013, some 437,000 of those deaths were African children under the age of five. However, malaria infections on the African continent have decreased significantly since the year 2000, falling by roughly a quarter, from 173 million to 128 million.
The WHO attributes these gains to the increased spread of established methods, including rapid diagnostic tests (which have risen globally from 46 million to 319 million over the past five years); malarial treatment using artemisinin (392 million treatments were bought last year, up from 11 million in 2004); and access to insecticide-treated nets (427 million of which have been distributed in the last two years).
Cooperation among unrelated individuals is frequently observed in social groups when their members combine efforts and resources to obtain a shared benefit that is unachievable by an individual alone. However, understanding why cooperation arises despite the natural tendency of individuals toward selfish behavior is still an open problem and represents one of the most fascinating challenges in evolutionary dynamics. Recently, the structural characterization of the networks in which social interactions take place has shed some light on the mechanisms by which cooperative behavior emerges and eventually overcomes the natural temptation to defect. In particular, it has been found that the heterogeneity in the number of social ties and the presence of tightly knit communities lead to a significant increase in cooperation as compared with the unstructured and homogeneous connection patterns considered in classical evolutionary dynamics. Here, we investigate the role of social-ties dynamics for the emergence of cooperation in a family of social dilemmas. Social interactions are in fact intrinsically dynamic, fluctuating, and intermittent over time, and they can be represented by time-varying networks. By considering two experimental data sets of human interactions with detailed time information, we show that the temporal dynamics of social ties has a dramatic impact on the evolution of cooperation: the dynamics of pairwise interactions favors selfish behavior.
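The "temptation to defect" invoked above is conventionally formalised with the prisoner's dilemma, whose payoffs satisfy T > R > P > S. A minimal payoff function using standard textbook values (not parameters from this particular paper):

```python
# Prisoner's dilemma payoffs with the standard ordering T > R > P > S:
# temptation, reward for mutual cooperation, punishment, sucker's payoff.
T, R, P, S = 5, 3, 1, 0

def payoff(mine, theirs):
    """Payoff to a player choosing `mine` ('C' or 'D') against `theirs`."""
    return {('C', 'C'): R, ('C', 'D'): S,
            ('D', 'C'): T, ('D', 'D'): P}[(mine, theirs)]
```

Whatever the partner does, defecting pays more (T > R and P > S), so defection dominates in a single pairwise encounter. That is precisely the selfish pressure that network heterogeneity and community structure can counteract, and that, per this study, the temporal dynamics of pairwise interactions instead reinforces.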
Alessio Cardillo, Giovanni Petri, Vincenzo Nicosia, Roberta Sinatra, Jesús Gómez-Gardeñes, and Vito Latora, "Evolutionary dynamics of time-resolved social interactions," Phys. Rev. E 90, 052825 (published 25 November 2014).
Abstract: A growing body of evidence demonstrates that in some contexts and for identifiable reasons, people make choices that are not in their interest, even when the stakes are high. Policymakers in a number of nations, including the United States and the United Kingdom, have used the underlying evidence to inform regulatory initiatives and choice architecture in a number of domains. Both the resulting actions and the relevant findings have raised the question whether an understanding of human errors opens greater space for paternalism. Behavioral market failures, which occur as a result of such errors, are an important supplement to the standard account of market failures; if promoting welfare is the guide, then behavioral market failures should be taken into consideration, even if the resulting actions are paternalistic. A general principle of behaviorally informed regulation – its first law – is that the appropriate responses to behavioral market failures usually consist of nudges, generally in the form of disclosure, warnings, and default rules. While some people invoke autonomy as an objection to paternalism, the strongest objections are welfarist in character. Official action may fail to respect heterogeneity, may diminish learning and self-help, may be subject to pressures from self-interested private groups (the problem of “behavioral public choice”), and may reflect the same errors that ordinary people make. The welfarist arguments against paternalism have considerable force, but choice architecture, and sometimes a form of paternalism, are inevitable, and to that extent the welfarist objections cannot get off the ground. Where paternalism is optional, the objections, though reasonable, depend on empirical assumptions that may not hold in identifiable contexts. There are many opportunities for improving human welfare through improved choice architecture.
By W. Brian Arthur; External Professor, Santa Fe Institute; Visiting Researcher, Palo Alto Research Center.
Economics is a stately subject, one that has altered little since its modern foundations were laid in Victorian times. Now it is changing radically. Standard economics is suddenly being challenged by a number of new approaches: behavioral economics, neuroeconomics, new institutional economics. One of the new approaches came to life at the Santa Fe Institute: complexity economics.
Complexity economics got its start in 1987 when a now-famous conference of scientists and economists convened by physicist Philip Anderson and economist Kenneth Arrow met to discuss the economy as an evolving complex system. That conference gave birth a year later to the Institute’s first research program – the Economy as an Evolving Complex System – and I was asked to lead this. That program in turn has gone on to lay down a new and different way to look at the economy.