Amazing Science
335.3K views | +152 today
Scooped by Dr. Stefan Gruenwald
onto Amazing Science!

Most abundant ocean viruses attack bacteria that are important for the carbon cycle

Most abundant ocean viruses attack bacteria that are important for the carbon cycle | Amazing Science |

In one corner is the Earth’s most abundant organism: SAR11, an ocean-living bacterium that survives where most other cells would die and plays a major role in the planet’s carbon cycle. It had been theorized that SAR11 was so small and widespread that it must be invulnerable to attack.


In the other corner, and so strange looking that scientists previously didn’t even recognize what they were, are “Pelagiphages,” viruses now known to infect SAR11 and routinely kill millions of these cells every second.


How this fight turns out is of more than casual interest, because SAR11 has a huge effect on the amount of carbon dioxide that enters the atmosphere, and the overall biology of the oceans.


“There’s a war going on in our oceans, a huge war, and we never even saw it,” says Stephen Giovannoni, a professor of microbiology at Oregon State University. “This is an important piece of the puzzle in how carbon is stored or released in the sea.” The analysis shows that the new viruses—like their hosts—are the most abundant on record.


The paper in Nature describes four previously unknown viruses that infect SAR11. To prove the viruses were as abundant as their hosts, Giovannoni and colleagues teamed up with researchers at the University of Arizona’s Tucson Marine Phage Research Lab, led by Matthew Sullivan, who had developed accurate methods for measuring viral diversity in nature.


The analysis shows that the new viruses—like their hosts—are the most abundant on record. Giovannoni’s group discovered the Pelagiphage viral families by using “old-fashioned” research methods, growing the cells and viruses in a laboratory, instead of the tools of modern genomics, and found the new type of virus.


“Because they are so new, these viruses were virtually unrecognizable to us based on their DNA,” Giovannoni says. “The viruses themselves, of course, appear to be just as abundant as SAR11.”


Sullivan explains the method for discovering viruses in the oceans based on their genomes his group developed over four years is at least 1,000 times more accurate than previous methods.


Their work resulted in the Pacific Ocean Virus dataset. This dataset, Sullivan explains, is the viral equivalent of the Global Ocean Sampling Expedition by former human genome researcher J. Craig Venter, who sailed across the world’s oceans sampling, sequencing, and analyzing the DNA of the microorganisms living in these waters. The new findings on SAR11 disprove the theory that the bacteria are immune to viral predation, Giovannoni and his co-authors say.


“In general, every living cell is vulnerable to viral infection,” says Giovannoni, who first discovered SAR11 in 1990. “What has been so puzzling about SAR11 was its sheer abundance, there was simply so much of it that some scientists believed it must not get attacked by viruses.” What the new research shows, Giovannoni says, is that SAR11 is competitive, good at scavenging organic carbon, and effective at changing quickly to avoid infection. Because of this, it thrives and persists in abundance even though the new viruses are constantly killing it.


No comment yet.
Your new post is loading...
Scooped by Dr. Stefan Gruenwald!

20,000+ FREE Online Science and Technology Lectures from Top Universities

20,000+ FREE Online Science and Technology Lectures from Top Universities | Amazing Science |

NOTE: To subscribe to the RSS feed of Amazing Science, copy into the URL field of your browser and click "subscribe".


This newsletter is aggregated from over 1450 news sources:


All my Tweets and Scoop.It! posts sorted and searchable:



You can search through all the articles semantically on my

archived twitter feed


NOTE: All articles in the amazing-science newsletter can also be sorted by topic. To do so, click the FIND buntton (symbolized by the FUNNEL on the top right of the screen)  and display all the relevant postings SORTED by TOPICS.


You can also type your own query:


e.g., you are looking for articles involving "dna" as a keyword

Or CLICK on the little FUNNEL symbol at the top right of the screen


MOST_READ • 3D_printing • aging • AI • anthropology • art • astronomy • bigdata • bioinformatics • biology • biotech • chemistry • computers • cosmology • education • environment • evolution • future • genetics • genomics • geosciences • green_energy • history • language • map • material_science • math • med • medicine • microscopy • nanotech • neuroscience • paleontology • photography • photonics • physics • postings • robotics • science • technology • video 

Casper Pieters's curator insight, March 9, 4:21 PM

Great resources for online learning just about everything.  All you need is will power and self- discipline.

Russ Roberts's curator insight, April 23, 8:37 PM

A very interesting site.  Amazing Science covers many disciplines.  Subscribe to the news letter and be " amazed." Aloha, Russ, KH6JRM. 

Siegfried Holle's curator insight, July 4, 5:45 AM

Your knowledge is your strength and power 

Scooped by Dr. Stefan Gruenwald!

BOSS uses 164,000 quasars to map the expanding universe

BOSS uses 164,000 quasars to map the expanding universe | Amazing Science |

Scientists looking at data from the Baryon Oscillation Spectroscopic Survey (BOSS), the largest program in the third Sloan Digital Sky Survey, have measured the expansion rate of the universe 10.8 billion years ago — a time prior to the onset of accelerated expansion caused by dark energy. The measurement is also the most precise measurement of a universal expansion rate ever made, with only 2% uncertainty. The results were announced at a press conference at the APS’s April meeting on Monday, at the same time that the results were posted on the arXiv.

The rate of universal expansion has changed over the course of the universe’s lifetime. It is believed to have gradually slowed down after the Big Bang, but mysteriously began accelerating again about 7 billion years ago. BOSS and other observatories have previously measured expansion rates going back 6 billion years.

To measure astronomical distances, astronomers will occasionally use so-called “standard candles” – these are supernovae with known luminosities. The difference between the known luminosity and the apparent luminosity indicates the supernova’s distance. The BOSS results measure the expansion factor of the universe using a “standard ruler” – a known distance between celestial objects. The expansion rate can be deduced when the known distance between two objects is compared to their apparent distance and each of their redshifts.

The “standard ruler” in this case is the imprint left over from sound waves in the early universe, also known as baryonic acoustic oscillations, or BAOs. These sound waves from the early universe should have created regularly spaced areas of high and low density in regular matter. For example, scientists see an excess of pairs of galaxies separated by about 450 million light-years – the length of the BAO ruler.

The BOSS experiment looked at the distance between quasars and nearby rings of gas. Quasars – galaxies with a supermassive black hole at their centre – are some of the brightest objects in the universe as a result of radiation from tremendous amounts of material falling into the black hole. Because the quasars formed in areas particularly dense with gas and dust, the imprint of the BAO is strong: it appears as a ring of gas roughly 450 million light-years away from the center of the quasar.

“The scale of that ring is precisely this baryonic acoustic oscillation…and that’s what we’re trying to measure,” says Andreu Font-Ribera of Lawrence Berkeley National Laboratory and one of the authors on the new paper.

Quasars not only provide an imprint of the BAO, they are also some of the most visible objects at such great distances from the earth. Even supernovae or entire galaxies (of the non-quasar variety) are virtually invisible at a distance of 10.8 billion light-years from earth. The BOSS team studied more than 164,000 quasars to make their measurement.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

New molecule puts scientists a step closer to understanding hydrogen storage

New molecule puts scientists a step closer to understanding hydrogen storage | Amazing Science |
Australian and Taiwanese scientists have discovered a new molecule which puts the science community one step closer to solving one of the barriers to development of cleaner, greener hydrogen fuel-cells as a viable power source for cars.

Scientists say that the newly-discovered "28copper15hydride" puts us on a path to better understanding hydrogen, and potentially even how to get it in and out of a fuel system, and is stored in a manner which is stable and safe – overcoming Hindenburg-type risks.

"28copper15hydride" is certainly not a name that would be developed by a marketing guru, but while it would send many running for an encyclopaedia (or let's face it, Wikipedia), it has some of the world's most accomplished chemists intrigued.

Its discovery was recently featured on the cover of one of the world's most prestigious chemistry journals, and details are being presented today by Australia's Dr Alison Edwards at the 41st International Conference on Coordination Chemistry, Singapore where 1100 chemists have gathered.

The molecule was synthesised by a team led by Prof Chenwei Liu from the National Dong Hwa University in Taiwan, who developed a partial structure model.

The chemical structure determination was completed by the team at the Australian Nuclear Science and Technology Organisation (ANSTO) using KOALA, one of the world's leading crystallography tools.

Most solid material is made of crystalline structures. The crystals are made up of regular arrangements of atoms stacked up like boxes in a tightly packed warehouse. The science of finding this arrangement, and structure of matter at the atomic level, is crystallography. ANSTO is Australia's home of this science.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Designing nanoparticles that can deliver drugs more easily

Designing nanoparticles that can deliver drugs more easily | Amazing Science |

A new study led by MIT materials scientists reveals the reason why gold nanoparticles  can easily slip through cell membranes to deliver drugs directly to target cells. The nanoparticles enter cells by taking advantage of a route normally used in vesicle-vesicle fusion, a crucial process that allows signal transmission between neurons.

In the July 21 issue of Nature Communications, the researchers describe in detail the mechanism by which these nanoparticles are able to fuse with a membrane. The findings suggest possible strategies for designing nanoparticles — made from gold or other materials — that could get into cells even more easily.

“We’ve identified a type of mechanism that might be more prevalent than is currently known,” says Reid Van Lehn, an MIT graduate student in materials science and engineering and one of the paper’s lead authors. “By identifying this pathway for the first time it also suggests not only how to engineer this particular class of nanoparticles, but that this pathway might be active in other systems as well.”

Most nanoparticles enter cells through endocytosis, a process that traps the particles in intracellular compartments, which can damage the cell membrane and cause cell contents to leak out. But in 2008, MIT researchers found that a special class of gold nanoparticles coated with a mix of molecules could enter cells without any disruption.

Last year, they discovered that the particles were somehow fusing with cell membranes and being absorbed into the cells. In their new study, they created detailed atomistic simulations to model how this happens, and performed experiments that confirmed the model’s predictions.


No comment yet.
Scooped by Dr. Stefan Gruenwald!

LEVAN: Learning Everything about Anything

LEVAN: Learning Everything about Anything | Amazing Science |

Recognition is graduating from labs to real-world applications. While it is encouraging to see its potential being tapped, it brings forth a fundamental challenge to the vision researcher: scalability. How can we learn a model for any concept that exhaustively covers all its appearance variations, while requiring minimal or no human supervision for compiling the vocabulary of visual variance, gathering the training images and annotations, and learning the models?

In this work, LEVAN developers introduce a fully-automated approach for learning extensive models for a wide range of variations (e.g. actions, interactions, attributes and beyond) within any concept. Their approach leverages vast resources of online books to discover the vocabulary of variance, and intertwines the data collection and modeling steps to alleviate the need for explicit human supervision in training the models. Their approach organizes the visual knowledge about a concept in a convenient and useful way, enabling a variety of applications across vision and NLP. A comprehensive aggregation of online system has been queried by users to learn models for several interesting concepts including, breakfast, Gandhi, beautiful, etc. To date, the LEVAN system has models available for over 50,000 variations within 150 concepts, and has annotated more than 10 million images with bounding boxes.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Study suggests probiotics could prevent obesity and insulin resistance

Study suggests probiotics could prevent obesity and insulin resistance | Amazing Science |

Vanderbilt University researchers have discovered that engineered probiotic bacteria (“friendly” bacteria like those in yogurt) in the gut produce a therapeutic compound that inhibits weight gain, insulin resistance, and other adverse effects of a high-fat diet in mice.

“Of course it’s hard to speculate from mouse to human,” said senior investigator Sean Davies, Ph.D., assistant professor of Pharmacology. “But essentially, we’ve prevented most of the negative consequences of obesity in mice, even though they’re eating a high-fat diet.”

The findings published in the August issue of the Journal of Clinical Investigation (open access) suggest that it may be possible to manipulate the bacterial residents of the gut — the gut microbiota — to treat obesity and other chronic diseases.

Davies has a long-standing interest in using probiotic bacteria to deliver drugs to the gut in a sustained manner, in order to eliminate the daily drug regimens associated with chronic diseases. In 2007, he received a National Institutes of Health Director’s New Innovator Award to develop and test the idea.

Other studies have demonstrated that the natural gut microbiota plays a role in obesity, diabetes and cardiovascular disease. “The types of bacteria you have in your gut influence your risk for chronic diseases,” Davies said. “We wondered if we could manipulate the gut microbiota in a way that would promote health.”

To start, the team needed a safe bacterial strain that colonizes the human gut. They selected E. coli Nissle 1917, which has been used as a probiotic treatment for diarrhea since its discovery nearly 100 years ago.

They genetically modified the E. coli Nissle strain to produce a lipid compound called N-acyl phosphatidylethanolamine (NAPE)*, which is normally synthesized in the small intestine in response to feeding. NAPE is rapidly converted to NAE, a compound that reduces both food intake and weight gain. Some evidence suggests that NAPE production may be reduced in individuals eating a high-fat diet.

“NAPE seemed like a great compound to try — since it’s something that the host normally produces,” Davies said.

The investigators added the NAPE-producing bacteria to the drinking water of mice eating a high-fat diet for eight weeks. Mice that received the modified bacteria had dramatically lower food intake, body fat, insulin resistance and fatty liver compared to mice receiving control bacteria.

They found that these protective effects persisted for at least four weeks after the NAPE-producing bacteria were removed from the drinking water. And even 12 weeks after the modified bacteria were removed, the treated mice still had much lower body weight and body fat compared to the control mice. Active bacteria no longer persisted after about six weeks.

Deborah Verran's comment, July 26, 7:31 AM
NB This research was performed in mice. The value of probiotics as for eg in some manufactured brands of yoghurt remains to be seen.
Eric Chan Wei Chiang's curator insight, July 27, 4:39 AM

The term biofortification is often applied to the nutritional enhancement of crops via selective breeding or genetic modification. I felt that term was suitable for describing the genetic enhancement of probiotics as these bacteria confer nutritional benefits and are often incorporated into functional foods.


I also find this technology fascinating because it much simpler and than other comparable therapies such as a bionic pancrease


Functional foods are another topic which interest me and more scoops on the topic can be read here:


Pierre-André Marechal's curator insight, Today, 9:23 AM

A vos commentaires...  PAM

Scooped by Dr. Stefan Gruenwald!

Extremely precise localization of sound origin

Extremely precise localization of sound origin | Amazing Science |

The parasitoid fly Ormia ochracea has the remarkable ability to locate crickets using audible sound. This ability is, in fact, remarkable as the fly's hearing mechanism spans only 1.5 mm which is 50× smaller than the wavelength of sound emitted by the cricket.

The hearing mechanism is, for all practical purposes, a point in space with no significant interaural time or level differences to draw from.

It has been discovered that evolution has empowered the fly with a hearing mechanism that utilizes multiple vibration modes to amplify interaural time and level differences. A team of scientist engineers now presents a fully integrated, man-made mimic of the Ormia's hearing mechanism capable of replicating the remarkable sound localization ability of the special fly.

A silicon-micromachined prototype is presented which uses multiple piezoelectric sensing ports to simultaneously transduce two orthogonal vibration modes of the sensing structure, thereby enabling simultaneous measurement of sound pressure and pressure gradient.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Biologist warn of early stages of Earth's sixth mass extinction event

Biologist warn of early stages of Earth's sixth mass extinction event | Amazing Science |
The planet's current biodiversity, the product of 3.5 billion years of evolutionary trial and error, is the highest in the history of life. But it may be reaching a tipping point. Scientists caution that the loss and decline of animals is contributing to what appears to be the early days of the planet's sixth mass biological extinction event. Since 1500, more than 320 terrestrial vertebrates have become extinct. Populations of the remaining species show a 25 percent average decline in abundance. The situation is similarly dire for invertebrate animal life.

And while previous extinctions have been driven by natural planetary transformations or catastrophic asteroid strikes, the current die-off can be associated to human activity, a situation that the lead author Rodolfo Dirzo, a professor of biology at Stanford, designates an era of "Anthropocene defaunation."

Across vertebrates, 16 to 33 percent of all species are estimated to be globally threatened or endangered. Large animals -- described as megafauna and including elephants, rhinoceroses, polar bears and countless other species worldwide -- face the highest rate of decline, a trend that matches previous extinction events.

Larger animals tend to have lower population growth rates and produce fewer offspring. They need larger habitat areas to maintain viable populations. Their size and meat mass make them easier and more attractive hunting targets for humans.

Although these species represent a relatively low percentage of the animals at risk, their loss would have trickle-down effects that could shake the stability of other species and, in some cases, even human health.

For instance, previous experiments conducted in Kenya have isolated patches of land from megafauna such as zebras, giraffes and elephants, and observed how an ecosystem reacts to the removal of its largest species. Rather quickly, these areas become overwhelmed with rodents. Grass and shrubs increase and the rate of soil compaction decreases. Seeds and shelter become more easily available, and the risk of predation drops.

Consequently, the number of rodents doubles -- and so does the abundance of the disease-carrying ectoparasites that they harbor.

"Where human density is high, you get high rates of defaunation, high incidence of rodents, and thus high levels of pathogens, which increases the risks of disease transmission," said Dirzo, who is also a senior fellow at the Stanford Woods Institute for the Environment. "Who would have thought that just defaunation would have all these dramatic consequences? But it can be a vicious circle."

No comment yet.
Scooped by Dr. Stefan Gruenwald!

WIRED: Have a Drone? Check This Map Before You Fly It

WIRED: Have a Drone? Check This Map Before You Fly It | Amazing Science |

The popularity of drones is climbing quickly among companies, governments and citizens alike. But the rules surrounding where, when and why you can fly an unmanned aerial vehicle aren’t very clear. The FAA has tried to assert control and insist on licensing for all drone operators, while drone pilots and some legal experts claim drones do not fall under the FAA’s purview. The uncertainty—and recent attempts by the FAA to fine a drone pilot and ground a search and rescue organization—has UAV operators nervous.

To help with the question of where it is legal to fly a drone, Mapbox has put together an interactive map of all the no-fly zones for UAVs they could find. Most of the red zones on the map are near airports, military sites and national parks. But as WIRED’s former Editor-in-Chief, Chris Anderson, now CEO of 3-D Robotics and founder of DIY Drones, discovered in 2007 when he crashed a drone bearing a camera into a tree on the grounds of Lawrence Berkeley National Laboratory, there is plenty of trouble in all sorts of places for drone operators to get into.

As one of the map’s authors, Bobby Sudekum, writes on the Mapbox blog, it’s a work in progress. They’ve made the data they collected available for anyone to use, and if you know of other no-fly zones that aren’t on the map, you can add that data to a public repository they started on GitHub.

For instance, you’ll see on the map below that there isn’t a no-fly area over Berkeley Lab, which sits in the greyed area in the hills above UC Berkeley. Similarly, there is no zone marked around Lawrence Livermore National Laboratory, one of the country’s two nuclear weapons labs. I have a call into the lab to check on the rules*, but in the meantime, if you have a drone, just know that in 2006, the lab acquired a Gatling gun that has a range of 1 mile and can fire 4,000 rounds a minute.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Smarter than a first-grader? Caledonian crows can perform as well as 7- to 10-year-olds on cause-and-effect water displacement tasks

Smarter than a first-grader? Caledonian crows can perform as well as 7- to 10-year-olds on cause-and-effect water displacement tasks | Amazing Science |
In Aesop's fable about the crow and the pitcher, a thirsty bird happens upon a vessel of water, but when he tries to drink from it, he finds the water level out of his reach. Not strong enough to knock over the pitcher, the bird drops pebbles into it -- one at a time -- until the water level rises enough for him to drink his fill. New research demonstrates the birds' intellectual prowess may be more fact than fiction.

Highlighting the value of ingenuity, the fable demonstrates that cognitive ability can often be more effective than brute force. It also characterizes crows as pretty resourceful problem solvers. New research conducted by UC Santa Barbara's Corina Logan, with her collaborators at the University of Auckland in New Zealand, demonstrates the birds' intellectual prowess may be more fact than fiction. Her findings appear today in the scientific journal PLOS ONE.

Logan is lead author of the paper, which examines causal cognition using a water displacement paradigm. "We showed that crows can discriminate between different volumes of water and that they can pass a modified test that so far only 7- to 10-year-old children have been able to complete successfully. We provide the strongest evidence so far that the birds attend to cause-and-effect relationships by choosing options that displace more water."

Logan, a junior research fellow at UCSB's SAGE Center for the Study of the Mind, worked with New Caledonian crows in a set of small aviaries in New Caledonia run by the University of Auckland. "We caught the crows in the wild and brought them into the aviaries, where they habituated in about five days," she said. Keeping families together, they housed the birds in separate areas of the aviaries for three to five months before releasing them back to the wild.

The testing room contained an apparatus consisting of two beakers of water, the same height, but one wide and the other narrow. The diameters of the lids were adjusted to be the same on each beaker. "The question is, can they distinguish between water volumes?" Logan said. "Do they understand that dropping a stone into a narrow tube will raise the water level more?" In a previous experiment by Sarah Jelbert and colleagues at the University of Auckland, the birds had not preferred the narrow tube. However, in that study, the crows were given 12 stones to drop in one or the other of the beakers, giving them enough to be successful with either one.

"When we gave them only four objects, they could succeed only in one tube -- the narrower one, because the water level would never get high enough in the wider tube; they were dropping all or most of the objects into the functional tube and getting the food reward," Logan explained. "It wasn't just that they preferred this tube, they appeared to know it was more functional." However, she noted, we still don't know exactly how the crows think when solving this task. They may be imagining the effect of each stone drop before they do it, or they may be using some other cognitive mechanism. "More work is needed," Logan said.

Logan also examined how the crows react to the U-tube task. Here, the crows had to choose between two sets of tubes. With one set, when subjects dropped a stone into a wide tube, the water level raised in an adjacent narrow tube that contained food. This was due to a hidden connection between the two tubes that allowed water to flow. The other set of tubes had no connection, so dropping a stone in the wide tube did not cause the water level to rise in its adjacent narrow tube.

Each set of tubes was marked with a distinct color cue, and test subjects had to notice that dropping a stone into a tube marked with one color resulted in the rise of the floating food in its adjacent small tube. "They have to put the stones into the blue tube or the red one, so all you have to do is learn a really simple rule that red equals food, even if that doesn't make sense because the causal mechanism is hidden," said Logan.

As it turns out, this is a very challenging task for both corvids (a family of birds that includes crows, ravens, jays and rooks) and children. Children ages 7 to 10 were able to learn the rules, as Lucy Cheke and colleagues at the University of Cambridge discovered in 2012. It may have taken a couple of tries to figure out how it worked, Logan noted, but the children consistently put the stones into the correct tube and got the reward (in this case, a token they exchanged for stickers). Children ages 4 to 6, however, were unable to work out the process. "They put the stones randomly into either tube and weren't getting the token consistently," she said.

Recently, Jelbert and colleagues from the University of Auckland put the New Caledonian crows to the test using the same apparatus the children did. The crows failed. So Logan and her team modified the apparatus, expanding the distance between the beakers. And Kitty, a six-month-old juvenile, figured it out. "We don't know how she passed it or what she understands about the task," Logan said, "so we don't know if the same cognitive processes or decisions are happening as with the children, but we now have evidence that they can. It's possible for the birds to pass it.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Where is machine intelligence going? What do super intelligences really want?

Where is machine intelligence going? What do super intelligences really want? | Amazing Science |

Let's face it, humans are pretty intelligent. Most people would not argue with this. We spend a large majority of our lives trying to become MORE intelligent. Some of us spend nearly three decades of our lives in school, learning about the world. We also strive to work together in groups, as nations, and as a species, to better tackle the problems that face us.

Fairly recently in the history of man, we have developed tools, industrial machines, and lately computer systems to help us in our pursuit of this goal. Some particular humans (specifically some transhumanists) believe that their purpose in life is to try and become better than human. In practice this usually means striving to live longer, to become more intelligent, healthier, more aware and more connected with others. The use of technology plays a key role in this ideology.

A second track of transhumanism is to facilitate and support improvement of machines in parallel to improvements in human quality of life. Many people argue that we have also already built complex computer programs which show a glimmer of autonomous intelligence, and that in the future we will be able to create computer programs that are equal to, or have a much greater level of intelligence than humans. Such an intelligent system will be able to self-improve, just as we humans identify gaps in our knowledge and try to fill them by going to school and by learning all we can from others. Our computer programs will soon be able to read Wikipedia and Google Books to learn, just like their creators.

She is also the cofounder of - and organization that works on connectome mapping of the brain and downloading memories.

Even in our deepest theories of machine intelligence, the idea of reward comes up. There is a theoretical model of intelligence called AIXI, developed by Marcus Hutter [3], which is basically a mathematical model which describes a very general, theoretical way in which an intelligent piece of code can work. This model is highly abstract, and allows, for example, all possible combinations of computer program code snippets to be considered in the construction of an intelligent system. Because of this, it hasn’t actually ever been implemented in a real computer. But, also because of this, the model is very general, and captures a description of the most intelligentprogram that could possibly exist. Note that in order to try and build something that even approximates this model is way beyond our computing capability at the moment, but we are talking now about computer systems that may in the future may be much more powerful. Anyway, the interesting thing about this model is that one of the parameters is a term describing… you guessed it… REWARD.

Changing your own code

We, as humans, are clever enough to look at this model, to understand it, and see that there is a reward term in there. And if we can see it, then any computer system that is based on this highly intelligent model will certainly be able to understand this model, and see the reward term too. But – and here’s the catch – the computer system that we build based on this model has the ability to change its own code! In fact it had to in order to become more intelligent than us in the first place, once it realized we were such lousy programmers and took over programming itself!

So imagine a simple example – our case from earlier – where a computer gets an additional ’1′ added to a numerical value for each good thing it does, and it tries to maximize the total by doing more good things. But if the computer program is clever enough, why can’t it just rewrite it’s own code and replace that piece of code that says ‘add 1′ with an ‘add 2′? Now the program gets twice the reward for every good thing that it does! And why stop at 2? Why not 3, or 4? Soon, the program will spend so much time thinking about adjusting its reward number that it will ignore the good task it was doing in the first place!

No comment yet.
Scooped by Dr. Stefan Gruenwald!

How to maintain quantum entanglement in amplified signals?

How to maintain quantum entanglement in amplified signals? | Amazing Science |

Physicists Sergei Filippov (MIPT and Russian Quantum Center at Skolkovo) and Mario Ziman (Masaryk University in Brno, Czech Republic, and the Institute of Physics in Bratislava, Slovakia) have found a way to preserve quantum entanglement of particles passing through an amplifier and, conversely, when transmitting a signal over long distances. Details are provided in an article published in the journal Physical Review A.

The laws of quantum mechanics do not allow for the teleportation of objects and people, but it is already possible to quantum teleport single photons and atoms, which opens up exciting opportunities for the creation of new computing devices and communication lines. Due to specific quantum effects, a quantum computer will be able to efficiently solve certain problems, for example, hacking codes used in banking, but for now it is still just a theoretical possibility. In practice, quantum computing and teleportation are obstructed by a process called decoherence.

Decoherence is the destruction of the quantum state due to the interaction of a quantum system with the outside world. For experiments in quantum computing, scientists use single atoms caught in magnetic traps and cooled to temperatures close to absolute zero. After going through kilometers of fiber, photons cease to be quantum entangled in most cases and become ordinary, unrelated light quanta.

To create an effective quantum computing system, scientists have to solve a number of problems, including preserving quantum entanglement when the signal abates and when it passes through an amplifier. Fiber-optic cables on the ocean bed contain a great deal of special amplifiers composed of optical glass and rare earth elements. It is these amplifiers that make it possible to watch high-resolution videos stored on a server in California from the MIPT campus or a university in Beijing.

In their article, Filippov and Ziman say that a certain class of signals can be transmitted so that the risk ofruining quantum entanglement becomes much lower. In this case, neither the attenuation nor the amplification of a signal ruins the entanglement. To achieve this effect, it is necessary to have the particles in a special, non-Gaussian state, or, as physicists put it, "the wave function of the particles in the coordinate representation should not be in the form of a Gaussian wave packet." A wave function is a basic concept of quantum mechanics, and Gaussian distribution is a major mathematical function used not only by physicists but also by statisticians, sociologists and economists.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Bats use polarized light to calibrate their internal magnetic compass

Bats use polarized light to calibrate their internal magnetic compass | Amazing Science |

Scientists have discovered that greater mouse-eared bats use polarization patterns in the sky to navigate -- the first mammal that's known to do this.

The bats use the way the Sun's light is scattered in the atmosphere at sunset to calibrate their internal magnetic compass, which helps them to fly in the right direction, a study published in Nature Communications has shown.

Despite this breakthrough, researchers have no idea how they manage to detect polarized light. 'We know that other animals use polarization patterns in the sky, and we have at least some idea how they do it: bees have specially-adapted photoreceptors in their eyes, and birds, fish, amphibians and reptiles all have cone cell structures in their eyes which may help them to detect polarization,' says Dr Richard Holland of Queen's University Belfast, co-author of the study.

'But we don't know which structure these bats might be using.' Polarization patterns depend on where the sun is in the sky. They're clearest in a strip across the sky 90° from the position of the sun at sunset or sunrise. But animals can still see the patterns long after sunset. This means they can orient themselves even when they can't see the sun, including when it's cloudy. Scientists have even shown that dung beetles use the polarization pattern of moonlight for orientation.

A hugely diverse range of creatures – including bees, anchovies, birds, reptiles and amphibians – use the patterns as a compass to work out which way is north, south, east and west.

M. Philip Oliver's curator insight, July 23, 8:48 AM

Thanks to Dr. Stefan

Scooped by Dr. Stefan Gruenwald!

BBC's Interactive Infochart: How Planes Crash - Assessing Flight Risk

BBC's Interactive Infochart: How Planes Crash - Assessing Flight Risk | Amazing Science |
The recent cluster of accidents show air travel still has its dangers. BBC Future’s infographic reveals how the latest tragedies compare with previous years.

Questions being asked:

  • Has flying really got safer?
  • What is the chief cause of plane crashes?
  • What was behind the worst ever disasters?
  • Did more crashes occur during take-off or landing?
No comment yet.
Scooped by Dr. Stefan Gruenwald!

Study of starling flight reveals message from turning bird sweeps through flock at constant speed

Study of starling flight reveals message from turning bird sweeps through flock at constant speed | Amazing Science |

A team of researchers with members from several countries working together in Rome, Italy, has come up with a new explanation of how it is that starlings are able to fly in a flock in a way that makes them appear as a single organism. In their paper published in the journal Nature Physics, the team describes how they used high-speed cameras to capture and study flight movement by individual bird members and what they found as a result.

Starling flight is as mesmerizing as it is mystifying—flocks of hundreds or thousands of birds sweep across the sky as if a single organism. The birds flying over Rome in particular have captured the imagination of bird enthusiasts, tourists, film makers and scientists alike. How do individual birds know when to turn and which way? Some have suggested it's a random thing, each bird simply flies making sure not to run into a neighbor. Others have suggested that some birds initiate a turn and others follow, creating a diffusion effect. In this new study, the researchers suggest that none of the earlier theories is correct—they've come up with something brand new.

To get a better look at the birds in flight, the researchers recorded flocks flying over Rome with high speed cameras and then took the results into their lab for examination. They found that turns are almost always initiated by just a few birds, but rather than other birds trying to figure out where to turn too, they instead simply copy how sharply their neighbor turns. This allows for the turn message to propagate through the flock at a very fast constant speed—approximately 20 to 40 meters per second, the team calculated. That constant message transfer speed means that each bird in a flock can respond in as little as half a second, without causing the flock to break apart.

Perhaps even more interesting is that when the researchers applied a spin factor for the turns by the birds, they found that applying it to the flock as a whole allowed for use of the same mathematical equations as physicists use to describe superfluid helium. The researchers believe that's not a coincidence, as there are many examples of physics and math principles that apply to the natural world.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

The Motion of the Medium Matters for Self-assembling Particles

The Motion of the Medium Matters for Self-assembling Particles | Amazing Science |

By attaching short sequences of single-stranded DNA to nanoscale building blocks, researchers can design structures that can effectively build themselves. The building blocks that are meant to connect have complementary DNA sequences on their surfaces, ensuring only the correct pieces bind together as they jostle into one another while suspended in a test tube.

Now, a University of Pennsylvania team has made a discovery with implications for all such self-assembled structures.

Earlier work assumed that the liquid medium in which these DNA-coated pieces float could be treated as a placid vacuum, but the Penn team has shown that fluid dynamics play a crucial role in the kind and quality of the structures that can be made in this way.

As the DNA-coated pieces rearrange themselves and bind, they create slipstreams into which other pieces can flow. This phenomenon makes some patterns within the structures more likely to form than others.

The research was conducted by professors Talid Sinno and John Crocker, alongside graduate students Ian Jenkins, Marie Casey and James McGinley, all of the Department of Chemical and Biomolecular Engineering in Penn’s School of Engineering and Applied Science.

It was published in the Proceedings of the National Academy of Sciences.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Study: Climate change and air pollution will combine to curb worldwide food supplies

Study: Climate change and air pollution will combine to curb worldwide food supplies | Amazing Science |
Ozone and higher temperatures can combine to reduce crop yields, but effects will vary by region.

Many studies have shown the potential for global climate change to cut food supplies. But these studies have, for the most part, ignored the interactions between increasing temperature and air pollution — specifically ozone pollution, which is known to damage crops.

A new study involving researchers at MIT shows that these interactions can be quite significant, suggesting that policymakers need to take both warming and air pollution into account in addressing food security.

The study looked in detail at global production of four leading food crops — rice, wheat, corn, and soy — that account for more than half the calories humans consume worldwide. It predicts that effects will vary considerably from region to region, and that some of the crops are much more strongly affected by one or the other of the factors: For example, wheat is very sensitive to ozone exposure, while corn is much more adversely affected by heat.

The research was carried out by Colette Heald, an associate professor of civil and environmental engineering (CEE) at MIT, former CEE postdoc Amos Tai, and Maria van Martin at Colorado State University. Their work is described this week in the journal Nature Climate Change.

Overall, with all other factors being equal, warming may reduce crop yields globally by about 10 percent by 2050, the study found. But the effects of ozone pollution are more complex — some crops are more strongly affected by it than others — which suggests that pollution-control measures could play a major role in determining outcomes. Ozone pollution can also be tricky to identify, Heald says, because its damage can resemble other plant illnesses, producing flecks on leaves and discoloration.

Potential reductions in crop yields are worrisome: The world is expected to need about 50 percent more food by 2050, the authors say, due to population growth and changing dietary trends in the developing world. So any yield reductions come against a backdrop of an overall need to increase production significantly through improved crop selections and farming methods, as well as expansion of farmland.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Mysterious signal from the center of the Perseus Cluster unexplained by known physics

Mysterious signal from the center of the Perseus Cluster unexplained by known physics | Amazing Science |

Astronomers using NASA's Chandra X-ray Observatory to explore the Perseus Cluster, a swarm of galaxies approximately 250 million light years from Earth, have observed the spectral line that appears not to come from any known type of matter.

Perseus Cluster a collection of galaxies and one of the most massive known objects in the Universe, immersed in an enormous 'atmosphere' of superheated plasma. It is approximately 768 000 light years across. "I couldn't believe my eyes," says Esra Bulbul of the Harvard Center for Astrophysics.  "What we found, at first glance, could not be explained by known physics."

"The cluster's atmosphere is full of ions such as Fe XXV,  Si XIV, and S XV.  Each one produces a 'bump' or 'line' in the x-ray spectrum, which we can map using Chandra. These spectral lines are at well-known x-ray energies."

Yet, in 2012 when Bulbul added together 17 day's worth of Chandra data, a new line popped up where no line should be. "A line appeared at 3.56 keV (kilo-electron volts) which does not correspond to any known atomic transition," she says.  "It was a great surprise."

We detected a weak unidentified emission line at E=(3.55-3.57)+/-0.03 keV in a stacked XMM spectrum of 73 galaxy clusters spanning a redshift range 0.01-0.35. MOS and PN observations independently show the presence of the line at consistent energies.

When the full sample is divided into three subsamples (Perseus, Centaurus+Ophiuchus+Coma, and all others), the line is significantly detected in all three independent MOS spectra and the PN "all others" spectrum. It is also detected in the Chandra spectra of Perseus with the flux consistent with XMM (though it is not seen in Virgo). However, it is very weak and located within 50-110eV of several known faint lines, and so is subject to significant modeling uncertainties. On the origin of this line, we argue that there should be no atomic transitions in thermal plasma at this energy. An intriguing possibility is the decay of sterile neutrino, a long-sought dark matter particle candidate.

Assuming that all dark matter is in sterile neutrinos with m_s=2E=7.1 keV, our detection in the full sample corresponds to a neutrino decay mixing angle sin^2(2theta)=7e-11, below the previous upper limits. However, based on the cluster masses and distances, the line in Perseus is much brighter than expected in this model. This appears to be because of an anomalously bright line at E=3.62 keV in Perseus, possibly an Ar XVII dielectronic recombination line, although its flux would be 30 times the expected value and physically difficult to understand. In principle, such an anomaly might explain our line detection in other subsamples as well, though it would stretch the line energy uncertainties. Another alternative is the above anomaly in the Ar line combined with the nearby 3.51 keV K line also exceeding expectation by factor 10-20. Confirmation with Chandra and Suzaku, and eventually Astro-H, are required to determine the nature of this new line.

Russ Roberts's curator insight, July 27, 8:49 PM

Thanks to Dr. Stefan Gruenwald for this fascinating look at a genuine mystery.  Astronomers don't know what they picked up their instruments when observations of the Perseus Cluster were processed.  Is this  unexplained phenomena, something beyond our known physics, or perhaps something akin to Jodie Foster's discovery in the film "Contact?"  In that film, amateur radio provided the background texture of the plot.  Whatever that signal was, it will keep scientists busy for a while.  Astronomers will have to confirm the data "with Chandra and Suzaku and eventually determine the nature of this new line." We are not alone in this universe. Aloha de Russ (KH6JRM).

Scooped by Dr. Stefan Gruenwald!

NASA: Earth escaped a near-miss solar storm in 2012

NASA: Earth escaped a near-miss solar storm in 2012 | Amazing Science |

Back in 2012, the Sun erupted with a powerful solar storm that just missed the Earth but was big enough to "knock modern civilization back to the 18th century," NASA said. The extreme space weather that tore through Earth's orbit on July 23, 2012, was the most powerful in 150 years, according to a statement posted on the US space agency website Wednesday.

However, few Earthlings had any idea what was going on. "If the eruption had occurred only one week earlier, Earth would have been in the line of fire," said Daniel Baker, professor of atmospheric and space physics at the University of Colorado. Instead the storm cloud hit the STEREO-A spacecraft, a solar observatory that is "almost ideally equipped to measure the parameters of such an event," NASA said. Scientists have analyzed the treasure trove of data it collected and concluded that it would have been comparable to the largest known space storm in 1859, known as the Carrington event. It also would have been twice as bad as the 1989 solar storm that knocked out power across Quebec, scientists said.

"I have come away from our recent studies more convinced than ever that Earth and its inhabitants were incredibly fortunate that the 2012 eruption happened when it did," said Baker. The National Academy of Sciences has said the economic impact of a storm like the one in 1859 could cost the modern economy more than two trillion dollars and cause damage that might take years to repair. Experts say solar storms can cause widespread power blackouts, disabling everything from radio to GPS communications to water supplies -- most of which rely on electric pumps.

They begin with an explosion on the Sun's surface, known as a solar flare, sending X-rays and extreme UV radiation toward Earth at light speed. Hours later, energetic particles follow and these electrons and protons can electrify satellites and damage their electronics.

Next are the coronal mass ejections, billion-ton clouds of magnetized plasma that take a day or more to cross the Sun-Earth divide. These are often deflected by Earth's magnetic shield, but a direct hit could be devastating.

Russ Roberts's curator insight, July 25, 8:54 PM

Thanks to Dr. Stefan Gruenwald for this interesting and somewhat scary story of how our modern, digitally connected world could have disappeared on 23 July 2012, but didn't.  On that date, a huge solar flare just missed the Earth.  According to NASA, the flare was "big enough to knock modern society back to the 18th century."  Daniel Baker, a professor of atmospheric and space physics at the University of Colorado, said data retrieved from the sun orbiting spacecraft STEREO-A supported the contention that this super flare was on the same level as the famous 1859 Carrington Event and the much weaker, though still serious, 1989 flare that crippled power distribution in Quebec, Canada.  Baker believes a direct hit from the 23 July 2012 flare would have rendered most solid state electronics, and hence, most of our digital world, inoperative.  Recovery would have cost trillions and modern society would take years to rebuild the damage communications infrastructure.  Such a flare would have "fried" most of our modern amateur radio transceivers, rendering some of us with no communications capability.  This is a cautionary tale for everyone.  It's not a matter of if, but when.  Are you prepared? Aloha de Russ (KH6JRM).

Tekrighter's curator insight, July 26, 7:44 AM

I have touched on this topic before in my blog (Is Technology a Trap for Humanity? - Perhaps it's time for an update.

Scooped by Dr. Stefan Gruenwald!

Western U.S. states using up ground water at an alarming rate

Western U.S. states using up ground water at an alarming rate | Amazing Science |

During intense drought, groundwater depletion in the Colorado River Basin has skyrocketed. For the past 14 years, drought has afflicted the Colorado River Basin, and one of the most visible signs has been the white bathtub rings around the red rocks of Lake Mead and Lake Powell, the two biggest dammed lakes on the river. But there is also an invisible bathtub being emptied, below ground. A new study shows that ground water in the basin is being depleted six times faster than surface water. The groundwater losses, which take thousands of years to be recharged naturally, point to the unsustainability of exploding population centers and water-intensive agriculture in the basin, which includes most of Arizona and parts of Colorado, California, Nevada, Utah, New Mexico, and Wyoming.

The study is the first to identify groundwater depletion across the entire Colorado River Basin, and it brings attention to a neglected issue, says Leonard Konikow, a hydrogeologist emeritus at the U.S. Geological Survey in Reston, Virginia, who was not involved with the work. Because ground water feeds many of the streams and rivers in the area, Konikow predicts that more of them will run dry. He says water pumping costs will rise as farmers—who are the biggest users of ground water—have to drill deeper and deeper into aquifers. “It’s disconcerting,” Konikow says. “Boy, water managers gotta do something about this, because this can’t go on forever.”

To document the groundwater depletion, James Famiglietti, a hydrologist at the University of California, Irvine, and his colleagues relied on a pair of NASA satellites called the Gravity Recovery and Climate Experiment (GRACE). The instruments are sensitive to tiny variations in Earth’s gravity. They can be used to observe groundwater extraction, because when the mass of that water disappears, gravity in that area also drops.

In the 9 years from December 2004 to November 2013, ground water was lost at a rate of 5.6 cubic kilometers a year, the team reports online today in Geophysical Research Letters. That’s compared with a decline of 0.9 cubic kilometers per year from Lake Powell and Lake Mead, which contain 85% of the surface water in the basin.

Famiglietti says it makes sense that cities and farmers turn from surface water to ground water during drought. But he is surprised by the magnitude of the loss. The groundwater depletion rate is twice that in California’s Central Valley, another place famous for heavy groundwater use.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Designing exascale computers and beyond

Designing exascale computers and beyond | Amazing Science |

Harvard's first large-scale digital computer, which came to be known as the Mark I, was conceived by Howard H. Aiken (A.M. '37, Ph.D. '39) and built by IBM. Fifty-one feet long, it was installed in the basement of what is now Lyman Laboratory in 1944, and later moved to a new building called the Aiken Computation Laboratory, where a generation of computing pioneers were educated and where the Maxwell Dworkin building now stands as part of the mechanism remains on exhibit in the Science Center.

The Mark I performed additions and subtractions at a rate of about three per second; multiplication and division took considerably longer. This benchmark was soon surpassed by computers that could do thousands of arithmetic operations per second, then millions and billions. By the late 1990s a few machines were reaching a trillion (1012) operations per second; these were called terascale computers, as tera is the Système International prefix for 1012. The next landmark—and the current state of the art—is the petascale computer, capable of 1015 operations per second. In 2010, Kaxiras' blood flow simulation ran on a petascale computer called Blue Gene/P in Jülich, Germany, which at the time held fifth place on the Top 500 list of supercomputers.

The new goal is an exascale machine, performing at least 1018 operations per second. This is a number so immense it challenges the imagination. Stacks of pennies reaching to the moon are not much help in expressing its magnitude—there would be millions of them. If an exascale computer counted off the age of the universe in units of a billionth of a second, the task would take a little more than 10 seconds.

And what comes after exascale? We can look forward to zettascale (1021) and yottascale (1024); then we run out of prefixes. The engine driving these amazing gains in computer performance is the ability of manufacturers to continually shrink the dimensions of transistors and other microelectronic devices, thereby cramming more of them onto a single chip. (The number of transistors per chip is in the billions now.) Until about 10 years ago, making transistors smaller also made them faster, allowing a speedup in the master clock, the metronome-like signal that sets the tempo for all operations in a digital computer. Between 1980 and 2005, clock rates increased by a factor of 1,000, from a few megahertz to a few gigahertz. But the era of ever-increasing clock rates has ended.

The speed limit for modern computers is now set by power consumption. If all other factors are held constant, the electricity needed to run a processor chip goes up as the cube of the clock rate: doubling the speed brings an eightfold increase in power demand. SEAS Dean Cherry A. Murray, the John A. and Elizabeth S. Armstrong Professor of Engineering and Applied Sciences and Professor of Physics, points out that high-performance chips are already at or above the 100-watt level. "Go much beyond that," she says, "and they would melt."

If the chipmakers cannot build faster transistors, however, they can still make them smaller and thus squeeze more onto each chip. Since 2005 the main strategy for boosting performance has been to gang together multiple processor "cores" on each chip. The clock rate remains roughly constant, but the total number of operations per second increases if the separate cores can be put to work simultaneously on different parts of the same task. Large systems are assembled from vast numbers of these multicore processors.

When the Kaxiras group's blood flow study ran on the Blue Gene/P at Jülich, the machine had almost 300,000 cores. The world's largest and fastest computer, as of June 2014, is the Tianhe-2 in Guangzhou, China, with more than 3 million cores. An exascale machine may have hundreds of millions of cores, or possibly as many as a billion.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Fingerprinting the chemical composition of giant exoplanets

Fingerprinting the chemical composition of giant exoplanets | Amazing Science |
A team of Brazilian and American astronomers used CFHT observations of the system 16 Cygni to discover evidence of how giant planets like Jupiter form.

One of the main models to form giant planets is called "core accretion". In this scenario, a rocky core forms first by aggregation of solid particles until it reaches a few Earth masses when it becomes massive enough to accrete a gaseous envelope. For the first time, astronomers have detected evidence of this rocky core, the first step in the formation of a giant planet like our own Jupiter.

The astronomers used the Canada-France-Hawaii Telescope (CFHT) to analyze the starlight of the binary stars 16 Cygni A and 16 Cygni B. The system is a perfect laboratory to study the formation of giant planets because the stars were born together and are therefore very similar, and both resemble the Sun. However, observations during the last decades show that only one of the two stars, 16 Cygni B, hosts a giant planet which is about 2.4 times as massive as Jupiter. By decomposing the light from the two stars into their basic components and looking at the difference between the two stars, the astronomers were able to detect signatures left from the planet formation process on 16 Cygni B.

The fingerprints detected by the astronomers are twofold. First, they found that the star 16 Cygni A is enhanced in all chemical elements relative to 16 Cygni B. This means that 16 Cygni B, the star that hosts a giant planet, is metal deficient. As both stars were born from the same natal cloud, they should have exactly the same chemical composition. However, planets and stars form at about the same time, hence the metals that are missing in 16 Cygni B (relative to 16 Cygni A) were probably removed from its protoplanetary disk to form its giant planet, so that the remaining material that was falling into 16 Cygni B in the final phases of its formation was deficient in those metals.

The second fingerprint is that on top of an overall deficiency of all analyzed elements in 16 Cygni B, this star has a systematic deficiency in the refractory elements such as iron, aluminum, nickel, magnesium, scandium, and silicon. This is a remarkable discovery because the rocky core of a giant planet is expected to be rich in refractory elements. The formation of the rocky core seems to rob refractory material from the proto-planetary disk, so that the star 16 Cygni B ended up with a lower amount of refractories. This deficiency in the refractory elements can be explained by the formation of a rocky core with a mass of about 1.5 – 6 Earth masses, which is similar to the estimate of Jupiter's core.

"Our results show that the formation of giant planets, as well as terrestrial planets like our own Earth, leaves subtle signatures in stellar atmospheres", says Marcelo Tucci Maia (Universidade de São Paulo), the lead author of the paper.

Read more at:

No comment yet.
Scooped by Dr. Stefan Gruenwald!

No Man’s Sky: A Computer Game Forged by Algorithms and Filled With a Diverse Flora and Fauna

No Man’s Sky: A Computer Game Forged by Algorithms and Filled With a Diverse Flora and Fauna | Amazing Science |

No Man’s Sky is a video game quite unlike any other. Sean Murray, one of the creators of the computer game No Man’s Sky, can’t guarantee that the virtual universe is infinite, but he’s certain that, if it isn’t, nobody will ever find out. “If you were to visit one virtual planet every second,” he says, “then our own sun will have died before you’d have seen them all.”

Developed for Sony’s PlayStation 4 by an improbably small team (the original four-person crew has grown only to 10 in recent months) at Hello Games, an independent studio in the south of England, it’s a game that presents a traversable universe in which every rock, flower, tree, creature, and planet has been “procedurally generated” to create a vast and diverse play area.

“We are attempting to do things that haven’t been done before,” says Murray. “No game has made it possible to fly down to a planet, and for it to be planet-sized, and feature life, ecology, lakes, caves, waterfalls, and canyons, then seamlessly fly up through the stratosphere and take to space again. It’s a tremendous challenge.”

Procedural generation, whereby a game’s landscape is generated not by an artist’s pen but an algorithm, is increasingly prevalent in video games. Most famously Minecraft creates a unique world for each of its players, randomly arranging rocks and lakes from a limited palette of bricks whenever someone begins a new game (see “The Secret to a Video Game Phenomenon”). But No Man’s Sky is far more complex and sophisticated. The tens of millions of planets that comprise the universe are all unique. Each is generated when a player discovers it, and is subject to the laws of its respective solar systems and vulnerable to natural erosion. The multitude of creatures that inhabit the universe dynamically breed and genetically mutate as time progresses. This is virtual world building on an unprecedented scale (see video).

This presents numerous technological challenges, not least of which is how to test a universe of such scale during its development – the team is currently using virtual testers—automated bots that wander around taking screenshots which are then sent back to the team for viewing. Additionally, while No Man’s Sky might have an infinite-sized universe, there aren’t an infinite number of players. To avoid the problem of a kind of virtual loneliness, where a player might never encounter another person on his or her travels, the game starts every new player in the same galaxy (albeit on his or her own planet) with a shared initial goal of traveling to its center. Later in the game, players can meet up, fight, trade, mine, and explore. “Ultimately we don’t know whether people will work, congregate, or disperse,” Murray says. “I know players don’t like to be told that we don’t know what will happen, but that’s what is exciting to us: the game is a vast experiment.”

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Museum workers pronounce dobsonfly found in China, largest aquatic insect

Museum workers pronounce dobsonfly found in China, largest aquatic insect | Amazing Science |

Workers with the Insect Museum of West China, who were recently given several very large dragon-fly looking insects, with long teeth, by locals in a part of Sichuan, have declared it, a giant dobsonfly the largest known aquatic insect in the world alive today. The find displaces the previous record holder, the South American helicopter damselfly, by just two centimeters.

The dobsonfly is common (there are over 220 species of them) in China, India, Africa, South America and some other parts of Asia, but until now, no specimens as large as those recently found in China have been known. The largest specimens in the found group had a wingspan of 21 centimeters, making it large enough to cover the entire face of a human adult. Locals don't have to worry too much about injury from the insects, however, as officials from the museum report that larger males' mandibles are so huge in proportion to their bodies that they are relatively weak—incapable of piercing human skin. They can kick up a stink, however, as they are able to spray an offensive odor when threatened.

Also, despite the fact that they look an awful lot like dragonflies, they are more closely related to fishflies. The long mandibles, though scary looking to humans, are actually used for mating—males use them to show off for females, and to hold them still during copulation. Interestingly, while their large wings (commonly twice their body length) make for great flying, they only make use of them for about a week—the rest of their time alive as adults is spent hiding under rocks or moving around on or under the water. That means that they are rarely seen as adults, which for most people is probably a good thing as the giants found in China would probably present a frightening sight. They are much better known during their long larval stage when they are used as bait by fishermen.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Ultrasound waves can spin a 200 nm wide gold nanomotor rod up to an impressive rotation of 150,000 rpm

Ultrasound waves can spin a 200 nm wide gold nanomotor rod up to an impressive rotation of 150,000 rpm | Amazing Science |

Scientists at the National Institute of Standards and Technology (NIST) have discovered that a gold nanorod submerged in water and exposed to high-frequency ultrasound waves can spin at an incredible speed of 150,000 RPM, about ten times faster than the previous record. The advance could lead to powerful nanomotors with important applications in medicine, high-speed machining, and the mixing of materials.

Take a rod only a few nanometers in size and find a way to make it spin as fast as possible, for as long as possible, and controlling it as precisely as possible. What you get is a nanomotor, a device that could one day be used to power hordes of tiny robots to build complex nanostructured materials or deliver drugs directly from inside a living cell.

Nanomotors have made giant strides in recent years: they've gotten much smaller and more reliable, and we can now also power them in many different ways. Available options include electricity, magnetic fields, blasting them with photons and, more recently, using ultrasound to rotate rods while they're submerged in water, which could prove very useful in a biological environment.

Previous studies have shown that applying a combination of ultrasound and magnetic fields can control both the spin and the forward motion of the nanorods, but nobody could tell just how fast they were spinning. Now, researchers at NIST have found that, despite being submerged in water, the rods are spinning at an impressive 150,000 RMP, which is 10 times faster than any nanoscale object submerged in liquid ever reported.

To clock the motor's speed, the researchers used gold rods which were 2 micrometers long and 300 nanometer wide. The rods were submerged in water and mixed with polystyrene nanoparticles, and positioned just above a speaker-type shaker.

The researchers will now focus on understanding exactly why the motors rotate (which is not yet well understood) and how the vortexes around the rods affects their interactions with each other.

A paper published in the journal ACS Nano describes the advance.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Removing parasitic retroviruses from the genome is a critical step in evolving larger bodies and longer lifespans

Removing parasitic retroviruses from the genome is a critical step in evolving larger bodies and longer lifespans | Amazing Science |
Cancer has left its 'footprint' on our evolution, according to a study which examined how the relics of ancient viruses are preserved in the genomes of 38 mammal species. The team found that as animals increased in size they 'edited out' potentially cancer-causing relics from their genomes so that mice have almost ten times as many ERVs as humans. The findings offer a clue as to why larger animals have a lower incidence of cancer than expected compared to smaller ones, and could help in the search for new anti-viral therapies.

Viral relics are evidence of the ancient battles our genes have fought against infection. Occasionally the retroviruses that infect an animal get incorporated into that animal's genome and sometimes these relics get passed down from generation to generation -- termed 'endogenous retroviruses' (ERVs). Because ERVs may be copied to other parts of the genome they contribute to the risk of cancer-causing mutations.

Now a team from Oxford University, Plymouth University, and the University of Glasgow has identified 27,711 ERVs preserved in the genomes of 38 mammal species, including humans, over the last 10 million years. The team found that as animals increased in size they 'edited out' these potentially cancer-causing relics from their genomes so that mice have almost ten times as many ERVs as humans. The findings offer a clue as to why larger animals have a lower incidence of cancer than expected compared to smaller ones, and could help in the search for new anti-viral therapies.

We set out to find as many of these viral relics as we could in everything from shrews and humans to elephants and dolphins,' said Dr Aris Katzourakis of Oxford University's Department of Zoology, lead author of the report. 'Viral relics are preserved in every cell of an animal: Because larger animals have many more cells they should have more of these endogenous retroviruses (ERVs) -- and so be at greater risk of ERV-induced mutations -- but we've found this isn't the case. In fact larger animals have far fewer ERVs, so they must have found ways to remove them.'

A combination of mathematical modelling and genome research uncovered some striking differences between mammal genomes: mice (c.19 grams) have 3331 ERVs, humans (c.59 kilograms) have 348 ERVs, whilst dolphins (c.281 kilograms) have just 55 ERVs.

'This is the first time that anyone has shown that having a large number of ERVs in your genome must be harmful -- otherwise larger animals wouldn't have evolved ways of limiting their numbers,' said Dr Katzourakis. 'Logically we think this is linked to the increased risk of ERV-based cancer-causing mutations and how mammals have evolved to combat this risk. So when we look at the pattern of ERV distribution across mammals it's like looking at the 'footprint' cancer has left on our evolution.'

Dr Robert Belshaw of Plymouth University Peninsula Schools of Medicine and Dentistry, School of Biomedical and Healthcare Sciences, added: "Cancer is caused by errors occurring in cells as they divide, so bigger animals -- with more cells -- ought to suffer more from cancer. Put simply, the blue whale should not exist. However, larger animals are not more prone to cancer than smaller ones: this is known as Peto's Paradox (named after Sir Richard Peto, the scientist credited with first spotting this). A team of scientists at Oxford, Plymouth and Glasgow Universities had been studying endogenous retroviruses, viruses like HIV but which have become part of their host's genome and which in other animals can cause cancer. Surprisingly, they found that bigger mammals have fewer of these viruses in their genome. This suggests that similar mechanism might be involved in fighting both cancer and the spread of these viruses, and that these are better in bigger animals (like humans) than smaller ones (like laboratory mice)."

No comment yet.