Amazing Science
Scooped by Dr. Stefan Gruenwald

Rapid Diagnostic Tests with Cell Phones to Battle Global Diseases

In the fight against emerging public health threats, early diagnosis of infectious diseases is crucial. And in poor and remote areas of the globe where conventional medical tools like microscopes and cytometers are unavailable, rapid diagnostic tests, or RDTs, are helping to make disease screening quicker and simpler. RDTs are generally small strips on which blood or fluid samples are placed. Specific changes in the color of the strip, which usually occur within minutes, indicate the presence of infection. Different tests can be used to detect various diseases, including HIV, malaria, tuberculosis and syphilis.

 

http://tinyurl.com/bklbznb

Scooped by Dr. Stefan Gruenwald

20,000+ FREE Online Science and Technology Lectures from Top Universities


NOTE: To subscribe to the RSS feed of Amazing Science, copy http://www.scoop.it/t/amazing-science/rss.xml into the URL field of your browser and click "subscribe".

 

This newsletter is aggregated from over 1450 news sources:

http://www.genautica.com/links/1450_news_sources.html

 

All my Tweets and Scoop.It! posts sorted and searchable:

http://www.genautica.com/tweets/index.html

 

••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••

You can search semantically through all the articles on my archived twitter feed.

••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••

NOTE: All articles in the Amazing Science newsletter can also be sorted by topic. To do so, click the FIND button (symbolized by the FUNNEL at the top right of the screen) to display all the relevant postings SORTED by TOPICS.

 

You can also type your own query:

 

e.g., if you are looking for articles involving "dna" as a keyword:

 

http://www.scoop.it/t/amazing-science/?q=dna


Or CLICK on the little FUNNEL symbol at the top right of the screen

••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••


MOST_READ • 3D_printing • aging • AI • anthropology • art • astronomy • bigdata • bioinformatics • biology • biotech • chemistry • computers • cosmology • education • environment • evolution • future • genetics • genomics • geosciences • green_energy • history • language • map • material_science • math • med • medicine • microscopy • nanotech • neuroscience • paleontology • photography • photonics • physics • postings • robotics • science • technology • video 

Casper Pieters's curator insight, March 9, 7:21 PM

Great resources for online learning just about everything. All you need is willpower and self-discipline.

Russ Roberts's curator insight, April 23, 11:37 PM

A very interesting site. Amazing Science covers many disciplines. Subscribe to the newsletter and be "amazed." Aloha, Russ, KH6JRM.

Siegfried Holle's curator insight, July 4, 8:45 AM

Your knowledge is your strength and power 

Scooped by Dr. Stefan Gruenwald

Mysterious signal from the center of the Perseus Cluster unexplained by known physics


Astronomers using NASA's Chandra X-ray Observatory to explore the Perseus Cluster, a swarm of galaxies approximately 250 million light years from Earth, have observed a spectral line that appears not to come from any known type of matter.


The Perseus Cluster is a collection of galaxies and one of the most massive known objects in the Universe, immersed in an enormous 'atmosphere' of superheated plasma. It is approximately 768,000 light years across. "I couldn't believe my eyes," says Esra Bulbul of the Harvard-Smithsonian Center for Astrophysics. "What we found, at first glance, could not be explained by known physics."


"The cluster's atmosphere is full of ions such as Fe XXV,  Si XIV, and S XV.  Each one produces a 'bump' or 'line' in the x-ray spectrum, which we can map using Chandra. These spectral lines are at well-known x-ray energies."


Yet in 2012, when Bulbul added together 17 days' worth of Chandra data, a new line popped up where no line should be. "A line appeared at 3.56 keV (kilo-electron volts) which does not correspond to any known atomic transition," she says. "It was a great surprise."


We detected a weak unidentified emission line at E=(3.55-3.57)+/-0.03 keV in a stacked XMM spectrum of 73 galaxy clusters spanning a redshift range 0.01-0.35. MOS and PN observations independently show the presence of the line at consistent energies.


When the full sample is divided into three subsamples (Perseus, Centaurus+Ophiuchus+Coma, and all others), the line is significantly detected in all three independent MOS spectra and the PN "all others" spectrum. It is also detected in the Chandra spectra of Perseus with a flux consistent with XMM (though it is not seen in Virgo). However, it is very weak and located within 50-110 eV of several known faint lines, and so is subject to significant modeling uncertainties. On the origin of this line, we argue that there should be no atomic transitions in thermal plasma at this energy. An intriguing possibility is the decay of a sterile neutrino, a long-sought dark matter particle candidate.


Assuming that all dark matter is in sterile neutrinos with m_s = 2E = 7.1 keV, our detection in the full sample corresponds to a neutrino decay mixing angle sin^2(2theta) = 7e-11, below the previous upper limits. However, based on the cluster masses and distances, the line in Perseus is much brighter than expected in this model. This appears to be because of an anomalously bright line at E = 3.62 keV in Perseus, possibly an Ar XVII dielectronic recombination line, although its flux would be 30 times the expected value and physically difficult to understand. In principle, such an anomaly might explain our line detection in other subsamples as well, though it would stretch the line energy uncertainties. Another alternative is the above anomaly in the Ar line combined with the nearby 3.51 keV K line also exceeding expectation by a factor of 10-20. Confirmation with Chandra and Suzaku, and eventually Astro-H, is required to determine the nature of this new line.
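As a back-of-the-envelope check (not part of the paper, just standard two-body kinematics): a sterile neutrino decaying at rest into a nearly massless active neutrino and a photon splits its rest-mass energy evenly, which is why a 7.1 keV neutrino mass maps to a photon line at half that energy.

```python
# Sketch: why m_s = 7.1 keV corresponds to the ~3.55 keV X-ray line.
# In the decay nu_s -> nu + photon, each product carries E = m_s / 2.
m_s_keV = 7.1                 # sterile neutrino mass assumed in the paper
E_line_keV = m_s_keV / 2.0    # expected photon (line) energy
print(f"Expected line energy: {E_line_keV:.3f} keV")   # 3.550 keV

# Consistent with the detected E = (3.55-3.57) +/- 0.03 keV range:
assert 3.55 - 0.03 <= E_line_keV <= 3.57 + 0.03
```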

Russ Roberts's curator insight, July 27, 11:49 PM

Thanks to Dr. Stefan Gruenwald for this fascinating look at a genuine mystery. Astronomers don't know what they picked up with their instruments when observations of the Perseus Cluster were processed. Is this an unexplained phenomenon, something beyond our known physics, or perhaps something akin to Jodie Foster's discovery in the film "Contact?" In that film, amateur radio provided the background texture of the plot. Whatever that signal was, it will keep scientists busy for a while. Astronomers will have to confirm the data "with Chandra and Suzaku and eventually Astro-H...to determine the nature of this new line." We are not alone in this universe. Aloha de Russ (KH6JRM).

Scooped by Dr. Stefan Gruenwald

NASA: Earth escaped a near-miss solar storm in 2012


Back in 2012, the Sun erupted with a powerful solar storm that just missed the Earth but was big enough to "knock modern civilization back to the 18th century," NASA said. The extreme space weather that tore through Earth's orbit on July 23, 2012, was the most powerful in 150 years, according to a statement posted on the US space agency website Wednesday.


However, few Earthlings had any idea what was going on. "If the eruption had occurred only one week earlier, Earth would have been in the line of fire," said Daniel Baker, professor of atmospheric and space physics at the University of Colorado. Instead the storm cloud hit the STEREO-A spacecraft, a solar observatory that is "almost ideally equipped to measure the parameters of such an event," NASA said. Scientists have analyzed the treasure trove of data it collected and concluded that it would have been comparable to the largest known space storm in 1859, known as the Carrington event. It also would have been twice as bad as the 1989 solar storm that knocked out power across Quebec, scientists said.


"I have come away from our recent studies more convinced than ever that Earth and its inhabitants were incredibly fortunate that the 2012 eruption happened when it did," said Baker. The National Academy of Sciences has said the economic impact of a storm like the one in 1859 could cost the modern economy more than two trillion dollars and cause damage that might take years to repair. Experts say solar storms can cause widespread power blackouts, disabling everything from radio to GPS communications to water supplies -- most of which rely on electric pumps.


They begin with an explosion on the Sun's surface, known as a solar flare, sending X-rays and extreme UV radiation toward Earth at light speed. Hours later, energetic particles follow and these electrons and protons can electrify satellites and damage their electronics.


Next are the coronal mass ejections, billion-ton clouds of magnetized plasma that take a day or more to cross the Sun-Earth divide. These are often deflected by Earth's magnetic shield, but a direct hit could be devastating.

Russ Roberts's curator insight, July 25, 11:54 PM

Thanks to Dr. Stefan Gruenwald for this interesting and somewhat scary story of how our modern, digitally connected world could have disappeared on 23 July 2012, but didn't. On that date, a huge solar flare just missed the Earth. According to NASA, the flare was "big enough to knock modern society back to the 18th century." Daniel Baker, a professor of atmospheric and space physics at the University of Colorado, said data retrieved from the sun-orbiting spacecraft STEREO-A supported the contention that this super flare was on the same level as the famous 1859 Carrington Event and the much weaker, though still serious, 1989 flare that crippled power distribution in Quebec, Canada. Baker believes a direct hit from the 23 July 2012 flare would have rendered most solid-state electronics, and hence most of our digital world, inoperative. Recovery would have cost trillions, and modern society would have taken years to rebuild the damaged communications infrastructure. Such a flare would have "fried" most of our modern amateur radio transceivers, leaving some of us with no communications capability. This is a cautionary tale for everyone. It's not a matter of if, but when. Are you prepared? Aloha de Russ (KH6JRM).

Tekrighter's curator insight, July 26, 10:44 AM

I have touched on this topic before in my blog (Is Technology a Trap for Humanity? - http://tekrighter.wordpress.com/page/3/). Perhaps it's time for an update.

Scooped by Dr. Stefan Gruenwald

Western U.S. states using up ground water at an alarming rate


During intense drought, groundwater depletion in the Colorado River Basin has skyrocketed. For the past 14 years, drought has afflicted the Colorado River Basin, and one of the most visible signs has been the white bathtub rings around the red rocks of Lake Mead and Lake Powell, the two biggest dammed lakes on the river. But there is also an invisible bathtub being emptied, below ground. A new study shows that ground water in the basin is being depleted six times faster than surface water. The groundwater losses, which take thousands of years to be recharged naturally, point to the unsustainability of exploding population centers and water-intensive agriculture in the basin, which includes most of Arizona and parts of Colorado, California, Nevada, Utah, New Mexico, and Wyoming.


The study is the first to identify groundwater depletion across the entire Colorado River Basin, and it brings attention to a neglected issue, says Leonard Konikow, a hydrogeologist emeritus at the U.S. Geological Survey in Reston, Virginia, who was not involved with the work. Because ground water feeds many of the streams and rivers in the area, Konikow predicts that more of them will run dry. He says water pumping costs will rise as farmers—who are the biggest users of ground water—have to drill deeper and deeper into aquifers. “It’s disconcerting,” Konikow says. “Boy, water managers gotta do something about this, because this can’t go on forever.”


To document the groundwater depletion, James Famiglietti, a hydrologist at the University of California, Irvine, and his colleagues relied on a pair of NASA satellites called the Gravity Recovery and Climate Experiment (GRACE). The instruments are sensitive to tiny variations in Earth’s gravity. They can be used to observe groundwater extraction, because when the mass of that water disappears, gravity in that area also drops.


In the 9 years from December 2004 to November 2013, ground water was lost at a rate of 5.6 cubic kilometers a year, the team reports online today in Geophysical Research Letters. That’s compared with a decline of 0.9 cubic kilometers per year from Lake Powell and Lake Mead, which contain 85% of the surface water in the basin.
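A quick sanity check of the "six times faster" figure, using only the two rates quoted above (a sketch, not from the paper):

```python
# Depletion rates reported for the Colorado River Basin (Dec 2004 - Nov 2013).
groundwater_km3_per_yr = 5.6   # basin-wide groundwater loss
surface_km3_per_yr = 0.9       # combined decline of Lake Powell and Lake Mead

ratio = groundwater_km3_per_yr / surface_km3_per_yr
print(f"Groundwater lost {ratio:.1f}x faster than surface water")  # ~6.2x

# Cumulative groundwater loss over the ~9-year GRACE window:
print(f"Total: {groundwater_km3_per_yr * 9:.0f} km^3")  # ~50 km^3
```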


Famiglietti says it makes sense that cities and farmers turn from surface water to ground water during drought. But he is surprised by the magnitude of the loss. The groundwater depletion rate is twice that in California’s Central Valley, another place famous for heavy groundwater use.

Scooped by Dr. Stefan Gruenwald

Designing exascale computers and beyond


Harvard's first large-scale digital computer, which came to be known as the Mark I, was conceived by Howard H. Aiken (A.M. '37, Ph.D. '39) and built by IBM. Fifty-one feet long, it was installed in the basement of what is now Lyman Laboratory in 1944, and later moved to a new building called the Aiken Computation Laboratory, where a generation of computing pioneers was educated and where the Maxwell Dworkin building now stands. Part of the mechanism remains on exhibit in the Science Center.


The Mark I performed additions and subtractions at a rate of about three per second; multiplication and division took considerably longer. This benchmark was soon surpassed by computers that could do thousands of arithmetic operations per second, then millions and billions. By the late 1990s a few machines were reaching a trillion (10^12) operations per second; these were called terascale computers, as tera is the Système International prefix for 10^12. The next landmark—and the current state of the art—is the petascale computer, capable of 10^15 operations per second. In 2010, Kaxiras' blood flow simulation ran on a petascale computer called Blue Gene/P in Jülich, Germany, which at the time held fifth place on the Top 500 list of supercomputers.


The new goal is an exascale machine, performing at least 10^18 operations per second. This is a number so immense it challenges the imagination. Stacks of pennies reaching to the moon are not much help in expressing its magnitude—there would be millions of them. If an exascale computer counted off the age of the universe in units of a billionth of a year, the task would take a little more than 10 seconds.
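The arithmetic behind that last comparison (a quick check, assuming a 13.8-billion-year-old universe):

```python
# Counting off the universe's age in billionths of a year at 10^18 counts/s.
age_universe_years = 13.8e9
counts = age_universe_years / 1e-9    # 1.38e19 units of a billionth of a year
exa_ops_per_second = 1e18
print(f"{counts / exa_ops_per_second:.1f} seconds")  # ~13.8 s
```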


And what comes after exascale? We can look forward to zettascale (10^21) and yottascale (10^24); then we run out of prefixes. The engine driving these amazing gains in computer performance is the ability of manufacturers to continually shrink the dimensions of transistors and other microelectronic devices, thereby cramming more of them onto a single chip. (The number of transistors per chip is in the billions now.) Until about 10 years ago, making transistors smaller also made them faster, allowing a speedup in the master clock, the metronome-like signal that sets the tempo for all operations in a digital computer. Between 1980 and 2005, clock rates increased by a factor of 1,000, from a few megahertz to a few gigahertz. But the era of ever-increasing clock rates has ended.


The speed limit for modern computers is now set by power consumption. If all other factors are held constant, the electricity needed to run a processor chip goes up as the cube of the clock rate: doubling the speed brings an eightfold increase in power demand. SEAS Dean Cherry A. Murray, the John A. and Elizabeth S. Armstrong Professor of Engineering and Applied Sciences and Professor of Physics, points out that high-performance chips are already at or above the 100-watt level. "Go much beyond that," she says, "and they would melt."
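A minimal illustration of that cubic scaling (illustrative numbers only, not measurements from any particular chip):

```python
# Power grows roughly as the cube of clock rate, all else held constant:
# P2 = P1 * (f2 / f1)**3, so doubling the clock costs 2**3 = 8x the power.
def scaled_power(p1_watts: float, f1_ghz: float, f2_ghz: float) -> float:
    """Estimated power at clock f2, given power p1 measured at clock f1."""
    return p1_watts * (f2_ghz / f1_ghz) ** 3

print(scaled_power(100.0, 3.0, 6.0))   # 800.0 W: doubling would melt the chip
print(scaled_power(100.0, 3.0, 1.5))   # 12.5 W: halving cuts power 8-fold
```

The second line hints at why multicore won: two cores at half the clock can match one fast core's throughput on parallel work for roughly a quarter of the power.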


If the chipmakers cannot build faster transistors, however, they can still make them smaller and thus squeeze more onto each chip. Since 2005 the main strategy for boosting performance has been to gang together multiple processor "cores" on each chip. The clock rate remains roughly constant, but the total number of operations per second increases if the separate cores can be put to work simultaneously on different parts of the same task. Large systems are assembled from vast numbers of these multicore processors.


When the Kaxiras group's blood flow study ran on the Blue Gene/P at Jülich, the machine had almost 300,000 cores. The world's largest and fastest computer, as of June 2014, is the Tianhe-2 in Guangzhou, China, with more than 3 million cores. An exascale machine may have hundreds of millions of cores, or possibly as many as a billion.

Scooped by Dr. Stefan Gruenwald

Fingerprinting the chemical composition of giant exoplanets

A team of Brazilian and American astronomers used CFHT observations of the system 16 Cygni to discover evidence of how giant planets like Jupiter form.


One of the main models to form giant planets is called "core accretion". In this scenario, a rocky core forms first by aggregation of solid particles until it reaches a few Earth masses when it becomes massive enough to accrete a gaseous envelope. For the first time, astronomers have detected evidence of this rocky core, the first step in the formation of a giant planet like our own Jupiter.


The astronomers used the Canada-France-Hawaii Telescope (CFHT) to analyze the starlight of the binary stars 16 Cygni A and 16 Cygni B. The system is a perfect laboratory to study the formation of giant planets because the stars were born together and are therefore very similar, and both resemble the Sun. However, observations during the last decades show that only one of the two stars, 16 Cygni B, hosts a giant planet which is about 2.4 times as massive as Jupiter. By decomposing the light from the two stars into their basic components and looking at the difference between the two stars, the astronomers were able to detect signatures left from the planet formation process on 16 Cygni B.


The fingerprints detected by the astronomers are twofold. First, they found that the star 16 Cygni A is enhanced in all chemical elements relative to 16 Cygni B. This means that 16 Cygni B, the star that hosts a giant planet, is metal deficient. As both stars were born from the same natal cloud, they should have exactly the same chemical composition. However, planets and stars form at about the same time, hence the metals that are missing in 16 Cygni B (relative to 16 Cygni A) were probably removed from its protoplanetary disk to form its giant planet, so that the remaining material that was falling into 16 Cygni B in the final phases of its formation was deficient in those metals.


The second fingerprint is that on top of an overall deficiency of all analyzed elements in 16 Cygni B, this star has a systematic deficiency in the refractory elements such as iron, aluminum, nickel, magnesium, scandium, and silicon. This is a remarkable discovery because the rocky core of a giant planet is expected to be rich in refractory elements. The formation of the rocky core seems to rob refractory material from the proto-planetary disk, so that the star 16 Cygni B ended up with a lower amount of refractories. This deficiency in the refractory elements can be explained by the formation of a rocky core with a mass of about 1.5 – 6 Earth masses, which is similar to the estimate of Jupiter's core.


"Our results show that the formation of giant planets, as well as terrestrial planets like our own Earth, leaves subtle signatures in stellar atmospheres", says Marcelo Tucci Maia (Universidade de São Paulo), the lead author of the paper.



Read more at: http://phys.org/news/2014-07-fingerprinting-formation-giant-planets.html#jCp

Scooped by Dr. Stefan Gruenwald

No Man’s Sky: A Computer Game Forged by Algorithms and Filled With a Diverse Flora and Fauna


No Man’s Sky is a video game quite unlike any other. Sean Murray, one of the creators of the computer game No Man’s Sky, can’t guarantee that the virtual universe is infinite, but he’s certain that, if it isn’t, nobody will ever find out. “If you were to visit one virtual planet every second,” he says, “then our own sun will have died before you’d have seen them all.”


Developed for Sony’s PlayStation 4 by an improbably small team (the original four-person crew has grown only to 10 in recent months) at Hello Games, an independent studio in the south of England, it’s a game that presents a traversable universe in which every rock, flower, tree, creature, and planet has been “procedurally generated” to create a vast and diverse play area.


“We are attempting to do things that haven’t been done before,” says Murray. “No game has made it possible to fly down to a planet, and for it to be planet-sized, and feature life, ecology, lakes, caves, waterfalls, and canyons, then seamlessly fly up through the stratosphere and take to space again. It’s a tremendous challenge.”


Procedural generation, whereby a game’s landscape is generated not by an artist’s pen but by an algorithm, is increasingly prevalent in video games. Most famously, Minecraft creates a unique world for each of its players, randomly arranging rocks and lakes from a limited palette of bricks whenever someone begins a new game (see “The Secret to a Video Game Phenomenon”). But No Man’s Sky is far more complex and sophisticated. The tens of millions of planets that comprise the universe are all unique. Each is generated when a player discovers it, and is subject to the laws of its respective solar system and vulnerable to natural erosion. The multitude of creatures that inhabit the universe dynamically breed and genetically mutate as time progresses. This is virtual world building on an unprecedented scale (see video).
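A toy sketch of the core idea (not Hello Games’ actual code): if every planet’s properties are derived deterministically from its coordinates, nothing needs to be stored: the same planet can be regenerated, identical, whenever any player discovers it.

```python
import hashlib

def planet(galaxy: int, system: int, index: int) -> dict:
    """Derive a planet's properties deterministically from its coordinates.

    Hashing the coordinates always yields the same bytes, so nothing is
    stored on disk and every visitor sees an identical planet.
    """
    seed = hashlib.sha256(f"{galaxy}:{system}:{index}".encode()).digest()
    return {
        "radius_km": 2000 + int.from_bytes(seed[0:2], "big") % 10_000,
        "ocean_fraction": seed[2] / 255,
        "has_life": seed[3] > 127,
        "terrain_seed": int.from_bytes(seed[4:8], "big"),  # feeds a noise generator
    }

# Reproducible: two "visits" to the same coordinates agree exactly.
print(planet(0, 42, 3) == planet(0, 42, 3))  # True
```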


This presents numerous technological challenges, not least of which is how to test a universe of such scale during its development. The team is currently using virtual testers: automated bots that wander around taking screenshots, which are then sent back to the team for viewing. Additionally, while No Man’s Sky might have an infinite-sized universe, there aren’t an infinite number of players. To avoid the problem of a kind of virtual loneliness, where a player might never encounter another person on his or her travels, the game starts every new player in the same galaxy (albeit on his or her own planet) with a shared initial goal of traveling to its center. Later in the game, players can meet up, fight, trade, mine, and explore. “Ultimately we don’t know whether people will work, congregate, or disperse,” Murray says. “I know players don’t like to be told that we don’t know what will happen, but that’s what is exciting to us: the game is a vast experiment.”

Scooped by Dr. Stefan Gruenwald

Museum workers pronounce dobsonfly found in China, largest aquatic insect


Workers with the Insect Museum of West China, who were recently given several very large, dragonfly-looking insects with long teeth by locals in a part of Sichuan, have declared one of them, a giant dobsonfly, the largest known aquatic insect alive in the world today. The find displaces the previous record holder, the South American helicopter damselfly, by just two centimeters.

The dobsonfly is common (there are over 220 species of them) in China, India, Africa, South America and some other parts of Asia, but until now, no specimens as large as those recently found in China have been known. The largest specimens in the found group had a wingspan of 21 centimeters, making it large enough to cover the entire face of a human adult. Locals don't have to worry too much about injury from the insects, however, as officials from the museum report that larger males' mandibles are so huge in proportion to their bodies that they are relatively weak—incapable of piercing human skin. They can kick up a stink, however, as they are able to spray an offensive odor when threatened.


Also, despite the fact that they look an awful lot like dragonflies, they are more closely related to fishflies. The long mandibles, though scary looking to humans, are actually used for mating—males use them to show off for females, and to hold them still during copulation. Interestingly, while their large wings (commonly twice their body length) make for great flying, they only make use of them for about a week—the rest of their time alive as adults is spent hiding under rocks or moving around on or under the water. That means that they are rarely seen as adults, which for most people is probably a good thing as the giants found in China would probably present a frightening sight. They are much better known during their long larval stage when they are used as bait by fishermen.

Scooped by Dr. Stefan Gruenwald

Ultrasound waves can spin a 200 nm wide gold nanomotor rod up to an impressive rotation of 150,000 rpm


Scientists at the National Institute of Standards and Technology (NIST) have discovered that a gold nanorod submerged in water and exposed to high-frequency ultrasound waves can spin at an incredible speed of 150,000 RPM, about ten times faster than the previous record. The advance could lead to powerful nanomotors with important applications in medicine, high-speed machining, and the mixing of materials.


Take a rod only a few nanometers in size and find a way to make it spin as fast as possible, for as long as possible, and controlling it as precisely as possible. What you get is a nanomotor, a device that could one day be used to power hordes of tiny robots to build complex nanostructured materials or deliver drugs directly from inside a living cell.


Nanomotors have made giant strides in recent years: they've gotten much smaller and more reliable, and we can now also power them in many different ways. Available options include electricity, magnetic fields, blasting them with photons and, more recently, using ultrasound to rotate rods while they're submerged in water, which could prove very useful in a biological environment.


Previous studies have shown that applying a combination of ultrasound and magnetic fields can control both the spin and the forward motion of the nanorods, but nobody could tell just how fast they were spinning. Now, researchers at NIST have found that, despite being submerged in water, the rods are spinning at an impressive 150,000 RPM, which is 10 times faster than any nanoscale object submerged in liquid ever reported.


To clock the motor's speed, the researchers used gold rods which were 2 micrometers long and 300 nanometers wide. The rods were submerged in water, mixed with polystyrene nanoparticles, and positioned just above a speaker-type shaker.

The researchers will now focus on understanding exactly why the motors rotate (which is not yet well understood) and how the vortexes around the rods affect their interactions with each other.


A paper published in the journal ACS Nano describes the advance.

Scooped by Dr. Stefan Gruenwald

Removing parasitic retroviruses from the genome is a critical step in evolving larger bodies and longer lifespans

Cancer has left its 'footprint' on our evolution, according to a study which examined how the relics of ancient viruses are preserved in the genomes of 38 mammal species. The team found that as animals increased in size they 'edited out' potentially cancer-causing relics from their genomes so that mice have almost ten times as many ERVs as humans. The findings offer a clue as to why larger animals have a lower incidence of cancer than expected compared to smaller ones, and could help in the search for new anti-viral therapies.


Viral relics are evidence of the ancient battles our genes have fought against infection. Occasionally the retroviruses that infect an animal get incorporated into that animal's genome and sometimes these relics get passed down from generation to generation -- termed 'endogenous retroviruses' (ERVs). Because ERVs may be copied to other parts of the genome they contribute to the risk of cancer-causing mutations.


Now a team from Oxford University, Plymouth University, and the University of Glasgow has identified 27,711 ERVs preserved in the genomes of 38 mammal species, including humans, over the last 10 million years.


'We set out to find as many of these viral relics as we could in everything from shrews and humans to elephants and dolphins,' said Dr Aris Katzourakis of Oxford University's Department of Zoology, lead author of the report. 'Viral relics are preserved in every cell of an animal: Because larger animals have many more cells they should have more of these endogenous retroviruses (ERVs) -- and so be at greater risk of ERV-induced mutations -- but we've found this isn't the case. In fact larger animals have far fewer ERVs, so they must have found ways to remove them.'


A combination of mathematical modelling and genome research uncovered some striking differences between mammal genomes: mice (c.19 grams) have 3331 ERVs, humans (c.59 kilograms) have 348 ERVs, whilst dolphins (c.281 kilograms) have just 55 ERVs.


'This is the first time that anyone has shown that having a large number of ERVs in your genome must be harmful -- otherwise larger animals wouldn't have evolved ways of limiting their numbers,' said Dr Katzourakis. 'Logically we think this is linked to the increased risk of ERV-based cancer-causing mutations and how mammals have evolved to combat this risk. So when we look at the pattern of ERV distribution across mammals it's like looking at the 'footprint' cancer has left on our evolution.'


Dr Robert Belshaw of Plymouth University Peninsula Schools of Medicine and Dentistry, School of Biomedical and Healthcare Sciences, added: "Cancer is caused by errors occurring in cells as they divide, so bigger animals -- with more cells -- ought to suffer more from cancer. Put simply, the blue whale should not exist. However, larger animals are not more prone to cancer than smaller ones: this is known as Peto's Paradox (named after Sir Richard Peto, the scientist credited with first spotting this). A team of scientists at Oxford, Plymouth and Glasgow Universities had been studying endogenous retroviruses, viruses like HIV but which have become part of their host's genome and which in other animals can cause cancer. Surprisingly, they found that bigger mammals have fewer of these viruses in their genome. This suggests that similar mechanism might be involved in fighting both cancer and the spread of these viruses, and that these are better in bigger animals (like humans) than smaller ones (like laboratory mice)."

Scooped by Dr. Stefan Gruenwald

Elephants possess a sense of smell that is likely the strongest ever identified in a single species


The African elephant's genome contains the largest number of smell receptor genes - nearly 2,000 - say the researchers in the journal Genome Research.


Olfactory receptors detect odors in the environment. That means elephants' sniffers are five times more powerful than people's noses, twice that of dogs, and even stronger than the previous known record-holder in the animal kingdom: rats.


"Apparently, an elephant's nose is not only long but also superior," says lead study author Dr Yoshihito Niimura of the University of Tokyo.


Just how these genes work is not well understood, but they likely helped elephants survive and navigate their environment over the ages.


The ability to smell allows creatures to find mates and food - and avoid predators.


The study compared elephant olfactory receptor genes to those of 13 other animals, including horses, rabbits, guinea pigs, cows, rodents and chimpanzees.


Primates and people actually had very low numbers of olfactory receptor genes compared to other species, the study found.

This could be "a result of our diminished reliance on smell as our visual acuity improved," says Niimura.

Scooped by Dr. Stefan Gruenwald

Powerful new sensor identifies molecules containing fewer than 20 atoms


Researchers at Rice University’s Laboratory for Nanophotonics (LANP) have created a unique sensor that amplifies the optical signature of molecules by about 100 billion times — accurately identifying the composition and structure of individual molecules containing fewer than 20 atoms.


The new single-molecule imaging method, described in the journal Nature Communications, uses a form of Raman spectroscopy in combination with an optical amplifier, making the sensor about 10 times more powerful than previously reported devices, said LANP Director Naomi Halas, the lead scientist on the study.


“The ideal single-molecule sensor would be able to identify an unknown molecule — even a very small one — without any prior information about that molecule’s structure or composition. That’s not possible with current technology, but this new technique has that potential.”


The optical sensor uses Raman spectroscopy, a technique pioneered in the 1930s that blossomed after the advent of lasers in the 1960s. When light strikes a molecule, most of its photons bounce off or pass directly through, but a tiny fraction — fewer than one in a trillion — are absorbed and re-emitted into another energy level that differs from their initial level. By measuring and analyzing these re-emitted photons through Raman spectroscopy, scientists can decipher the types of atoms in a molecule as well as their structural arrangement.


Scientists have created a number of techniques to boost Raman signals. In the new study, LANP graduate student Yu Zhang used one of these, a two-coherent-laser technique called “coherent anti-Stokes Raman spectroscopy,” or CARS. By using CARS in conjunction with a light amplifier made of four tiny gold nanodiscs, Halas and Zhang were able to measure single molecules in a powerful new way. LANP has dubbed the new technique “surface-enhanced CARS,” or SECARS.

Scooped by Dr. Stefan Gruenwald

Noninvasive retinal imaging device detects Alzheimer’s up to 20 years in advance


Cedars-Sinai Medical Center researchers have developed a noninvasive retinal imaging device that can provide early detection of changes indicating Alzheimer’s disease 15 to 20 years before clinical diagnosis.


“In preliminary results in 40 patients, the test could differentiate between Alzheimer’s disease and non-Alzheimer’s disease with 100 percent sensitivity and 80.6 percent specificity, meaning that all people with the disease tested positive and most of the people without the disease tested negative,” said Shaun Frost, a biomedical scientist and the study manager at the Commonwealth Scientific and Industrial Research Organisation (CSIRO), Australia’s national science agency.
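For readers unfamiliar with the two metrics, they fall straight out of a 2x2 confusion matrix. The counts below are hypothetical, chosen only to reproduce the reported percentages for 40 patients; the article does not give the actual split.

```python
# Sensitivity = TP / (TP + FN): fraction of diseased patients who test positive.
# Specificity = TN / (TN + FP): fraction of healthy patients who test negative.
tp, fn = 9, 0     # hypothetical: all 9 Alzheimer's patients test positive
tn, fp = 25, 6    # hypothetical: 25 of 31 controls test negative

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.1%}")  # 100.0%
print(f"specificity = {specificity:.1%}")  # 80.6%
```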


Keith Black, MD, professor and chair of Cedars-Sinai’s Department of Neurosurgery and director of the Maxine Dunitz Neurosurgical Institute and the Ruth and Lawrence Harvey Chair in Neuroscience, said the accumulation of beta-amyloid plaque in the brain is a hallmark sign of Alzheimer’s, but current tests detect changes only after the disease has advanced to late stages.


Researchers believe that as treatment options improve, early detection will be critical, but existing diagnostic methods are inconvenient, costly and impractical for routine screening.


“PET scans require the use of radioactive tracers, and cerebrospinal fluid analysis requires that patients undergo invasive and often painful lumbar punctures, but neither approach is quite feasible, especially for patients in the earlier stages of disease,” he said. Positron emission tomography, or PET, is the current diagnostic standard.


“The retina, unlike other structures of the eye, is part of the central nervous system, sharing many characteristics of the brain. A few years ago, we discovered at Cedars-Sinai that the plaques associated with Alzheimer’s disease occur not only in the brain but also in the retina.”


Krishan Maggon's curator insight, July 22, 5:43 PM

The test has to be validated in a large number of AD patients at various stages of AD, with an elderly control group with normal cognitive function.

Scooped by Dr. Stefan Gruenwald

LEVAN: Learning Everything about Anything


Recognition is graduating from labs to real-world applications. While it is encouraging to see its potential being tapped, it brings forth a fundamental challenge to the vision researcher: scalability. How can we learn a model for any concept that exhaustively covers all its appearance variations, while requiring minimal or no human supervision for compiling the vocabulary of visual variance, gathering the training images and annotations, and learning the models?


In this work, the LEVAN developers introduce a fully-automated approach for learning extensive models for a wide range of variations (e.g. actions, interactions, attributes and beyond) within any concept. Their approach leverages vast resources of online books to discover the vocabulary of variance, and intertwines the data collection and modeling steps to alleviate the need for explicit human supervision in training the models. Their approach organizes the visual knowledge about a concept in a convenient and useful way, enabling a variety of applications across vision and NLP. The online system has been queried by users to learn models for several interesting concepts including breakfast, Gandhi, beautiful, etc. To date, the LEVAN system has models available for over 50,000 variations within 150 concepts, and has annotated more than 10 million images with bounding boxes.

Scooped by Dr. Stefan Gruenwald

Study suggests probiotics could prevent obesity and insulin resistance


Vanderbilt University researchers have discovered that engineered probiotic bacteria (“friendly” bacteria like those in yogurt) in the gut produce a therapeutic compound that inhibits weight gain, insulin resistance, and other adverse effects of a high-fat diet in mice.


“Of course it’s hard to speculate from mouse to human,” said senior investigator Sean Davies, Ph.D., assistant professor of Pharmacology. “But essentially, we’ve prevented most of the negative consequences of obesity in mice, even though they’re eating a high-fat diet.”


The findings published in the August issue of the Journal of Clinical Investigation (open access) suggest that it may be possible to manipulate the bacterial residents of the gut — the gut microbiota — to treat obesity and other chronic diseases.


Davies has a long-standing interest in using probiotic bacteria to deliver drugs to the gut in a sustained manner, in order to eliminate the daily drug regimens associated with chronic diseases. In 2007, he received a National Institutes of Health Director’s New Innovator Award to develop and test the idea.


Other studies have demonstrated that the natural gut microbiota plays a role in obesity, diabetes and cardiovascular disease. “The types of bacteria you have in your gut influence your risk for chronic diseases,” Davies said. “We wondered if we could manipulate the gut microbiota in a way that would promote health.”


To start, the team needed a safe bacterial strain that colonizes the human gut. They selected E. coli Nissle 1917, which has been used as a probiotic treatment for diarrhea since its discovery nearly 100 years ago.


They genetically modified the E. coli Nissle strain to produce a lipid compound called N-acyl phosphatidylethanolamine (NAPE)*, which is normally synthesized in the small intestine in response to feeding. NAPE is rapidly converted to NAE, a compound that reduces both food intake and weight gain. Some evidence suggests that NAPE production may be reduced in individuals eating a high-fat diet.


“NAPE seemed like a great compound to try — since it’s something that the host normally produces,” Davies said.


The investigators added the NAPE-producing bacteria to the drinking water of mice eating a high-fat diet for eight weeks. Mice that received the modified bacteria had dramatically lower food intake, body fat, insulin resistance and fatty liver compared to mice receiving control bacteria.


They found that these protective effects persisted for at least four weeks after the NAPE-producing bacteria were removed from the drinking water. And even 12 weeks after the modified bacteria were removed, the treated mice still had much lower body weight and body fat compared to the control mice. Active bacteria no longer persisted after about six weeks.

Deborah Verran's comment, July 26, 10:31 AM
NB: This research was performed in mice. The value of probiotics, e.g. in some manufactured brands of yoghurt, remains to be seen.
Eric Chan Wei Chiang's curator insight, July 27, 7:39 AM

The term biofortification is often applied to the nutritional enhancement of crops via selective breeding or genetic modification. I felt that term was suitable for describing the genetic enhancement of probiotics as these bacteria confer nutritional benefits and are often incorporated into functional foods.

 

I also find this technology fascinating because it is much simpler than other comparable therapies, such as a bionic pancreas: http://sco.lt/6W8BuL

 

Functional foods are another topic that interests me; more scoops on the topic can be read here:

http://www.scoop.it/t/food-health-and-nutrition/?tag=Functional+Foods

 

Scooped by Dr. Stefan Gruenwald

Extremely precise localization of sound origin


The parasitoid fly Ormia ochracea has the remarkable ability to locate crickets using audible sound. This ability is remarkable because the fly's hearing mechanism spans only 1.5 mm, which is 50× smaller than the wavelength of sound emitted by the cricket. The hearing mechanism is, for all practical purposes, a point in space with no significant interaural time or level differences to draw from.
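The numbers are easy to verify (a rough calculation; the speed of sound in air and the cricket calling frequency below are assumed values, not given in the text):

```python
# Wavelength of a cricket chirp vs. the fly's 1.5 mm ear span.
speed_of_sound = 343.0     # m/s in air (assumed)
chirp_hz = 5000.0          # typical cricket calling frequency (assumed)

wavelength_mm = speed_of_sound / chirp_hz * 1000
print(f"wavelength: {wavelength_mm:.1f} mm")              # ~68.6 mm
print(f"ratio to ear span: {wavelength_mm / 1.5:.0f}x")   # ~46x

# The maximum interaural time difference is correspondingly tiny:
print(f"max delay: {1.5e-3 / speed_of_sound * 1e6:.1f} us")  # ~4.4 microseconds
```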


It has been discovered that evolution has empowered the fly with a hearing mechanism that utilizes multiple vibration modes to amplify interaural time and level differences. A team of scientists and engineers now presents a fully integrated, man-made mimic of the Ormia's hearing mechanism capable of replicating the remarkable sound localization ability of this special fly.


A silicon-micromachined prototype is presented which uses multiple piezoelectric sensing ports to simultaneously transduce two orthogonal vibration modes of the sensing structure, thereby enabling simultaneous measurement of sound pressure and pressure gradient.

Scooped by Dr. Stefan Gruenwald

Biologists warn of early stages of Earth's sixth mass extinction event

The planet's current biodiversity, the product of 3.5 billion years of evolutionary trial and error, is the highest in the history of life. But it may be reaching a tipping point. Scientists caution that the loss and decline of animals is contributing to what appears to be the early days of the planet's sixth mass biological extinction event. Since 1500, more than 320 terrestrial vertebrates have become extinct. Populations of the remaining species show a 25 percent average decline in abundance. The situation is similarly dire for invertebrate animal life.


And while previous extinctions have been driven by natural planetary transformations or catastrophic asteroid strikes, the current die-off can be attributed to human activity, a situation that lead author Rodolfo Dirzo, a professor of biology at Stanford, designates an era of "Anthropocene defaunation."


Across vertebrates, 16 to 33 percent of all species are estimated to be globally threatened or endangered. Large animals -- described as megafauna and including elephants, rhinoceroses, polar bears and countless other species worldwide -- face the highest rate of decline, a trend that matches previous extinction events.


Larger animals tend to have lower population growth rates and produce fewer offspring. They need larger habitat areas to maintain viable populations. Their size and meat mass make them easier and more attractive hunting targets for humans.


Although these species represent a relatively low percentage of the animals at risk, their loss would have trickle-down effects that could shake the stability of other species and, in some cases, even human health.


For instance, previous experiments conducted in Kenya have isolated patches of land from megafauna such as zebras, giraffes and elephants, and observed how an ecosystem reacts to the removal of its largest species. Rather quickly, these areas become overwhelmed with rodents. Grass and shrubs increase and the rate of soil compaction decreases. Seeds and shelter become more easily available, and the risk of predation drops.


Consequently, the number of rodents doubles -- and so does the abundance of the disease-carrying ectoparasites that they harbor.

"Where human density is high, you get high rates of defaunation, high incidence of rodents, and thus high levels of pathogens, which increases the risks of disease transmission," said Dirzo, who is also a senior fellow at the Stanford Woods Institute for the Environment. "Who would have thought that just defaunation would have all these dramatic consequences? But it can be a vicious circle."

Scooped by Dr. Stefan Gruenwald

WIRED: Have a Drone? Check This Map Before You Fly It


The popularity of drones is climbing quickly among companies, governments and citizens alike. But the rules surrounding where, when and why you can fly an unmanned aerial vehicle aren’t very clear. The FAA has tried to assert control and insist on licensing for all drone operators, while drone pilots and some legal experts claim drones do not fall under the FAA’s purview. The uncertainty—and recent attempts by the FAA to fine a drone pilot and ground a search and rescue organization—has UAV operators nervous.


To help with the question of where it is legal to fly a drone, Mapbox has put together an interactive map of all the no-fly zones for UAVs they could find. Most of the red zones on the map are near airports, military sites and national parks. But as WIRED’s former Editor-in-Chief Chris Anderson, now CEO of 3-D Robotics and founder of DIY Drones, discovered in 2007 when he crashed a drone bearing a camera into a tree on the grounds of Lawrence Berkeley National Laboratory, there is plenty of trouble in all sorts of places for drone operators to get into.


As one of the map’s authors, Bobby Sudekum, writes on the Mapbox blog, it’s a work in progress. They’ve made the data they collected available for anyone to use, and if you know of other no-fly zones that aren’t on the map, you can add that data to a public repository they started on GitHub.


For instance, you’ll see on the map below that there isn’t a no-fly area over Berkeley Lab, which sits in the greyed area in the hills above UC Berkeley. Similarly, there is no zone marked around Lawrence Livermore National Laboratory, one of the country’s two nuclear weapons labs. I have a call into the lab to check on the rules*, but in the meantime, if you have a drone, just know that in 2006, the lab acquired a Gatling gun that has a range of 1 mile and can fire 4,000 rounds a minute.

Scooped by Dr. Stefan Gruenwald

Smarter than a first-grader? Caledonian crows can perform as well as 7- to 10-year-olds on cause-and-effect water displacement tasks

In Aesop's fable about the crow and the pitcher, a thirsty bird happens upon a vessel of water, but when he tries to drink from it, he finds the water level out of his reach. Not strong enough to knock over the pitcher, the bird drops pebbles into it -- one at a time -- until the water level rises enough for him to drink his fill. New research demonstrates the birds' intellectual prowess may be more fact than fiction.


Highlighting the value of ingenuity, the fable demonstrates that cognitive ability can often be more effective than brute force. It also characterizes crows as pretty resourceful problem solvers. New research conducted by UC Santa Barbara's Corina Logan, with her collaborators at the University of Auckland in New Zealand, demonstrates the birds' intellectual prowess may be more fact than fiction. Her findings appear today in the scientific journal PLOS ONE.


Logan is lead author of the paper, which examines causal cognition using a water displacement paradigm. "We showed that crows can discriminate between different volumes of water and that they can pass a modified test that so far only 7- to 10-year-old children have been able to complete successfully. We provide the strongest evidence so far that the birds attend to cause-and-effect relationships by choosing options that displace more water."


Logan, a junior research fellow at UCSB's SAGE Center for the Study of the Mind, worked with New Caledonian crows in a set of small aviaries in New Caledonia run by the University of Auckland. "We caught the crows in the wild and brought them into the aviaries, where they habituated in about five days," she said. Keeping families together, they housed the birds in separate areas of the aviaries for three to five months before releasing them back to the wild.


The testing room contained an apparatus consisting of two beakers of water, the same height, but one wide and the other narrow. The diameters of the lids were adjusted to be the same on each beaker. "The question is, can they distinguish between water volumes?" Logan said. "Do they understand that dropping a stone into a narrow tube will raise the water level more?" In a previous experiment by Sarah Jelbert and colleagues at the University of Auckland, the birds had not preferred the narrow tube. However, in that study, the crows were given 12 stones to drop in one or the other of the beakers, giving them enough to be successful with either one.


"When we gave them only four objects, they could succeed only in one tube -- the narrower one, because the water level would never get high enough in the wider tube; they were dropping all or most of the objects into the functional tube and getting the food reward," Logan explained. "It wasn't just that they preferred this tube, they appeared to know it was more functional." However, she noted, we still don't know exactly how the crows think when solving this task. They may be imagining the effect of each stone drop before they do it, or they may be using some other cognitive mechanism. "More work is needed," Logan said.


Logan also examined how the crows react to the U-tube task. Here, the crows had to choose between two sets of tubes. With one set, when subjects dropped a stone into a wide tube, the water level raised in an adjacent narrow tube that contained food. This was due to a hidden connection between the two tubes that allowed water to flow. The other set of tubes had no connection, so dropping a stone in the wide tube did not cause the water level to rise in its adjacent narrow tube.


Each set of tubes was marked with a distinct color cue, and test subjects had to notice that dropping a stone into a tube marked with one color resulted in the rise of the floating food in its adjacent small tube. "They have to put the stones into the blue tube or the red one, so all you have to do is learn a really simple rule that red equals food, even if that doesn't make sense because the causal mechanism is hidden," said Logan.


As it turns out, this is a very challenging task for both corvids (a family of birds that includes crows, ravens, jays and rooks) and children. Children ages 7 to 10 were able to learn the rules, as Lucy Cheke and colleagues at the University of Cambridge discovered in 2012. It may have taken a couple of tries to figure out how it worked, Logan noted, but the children consistently put the stones into the correct tube and got the reward (in this case, a token they exchanged for stickers). Children ages 4 to 6, however, were unable to work out the process. "They put the stones randomly into either tube and weren't getting the token consistently," she said.


Recently, Jelbert and colleagues from the University of Auckland put the New Caledonian crows to the test using the same apparatus the children did. The crows failed. So Logan and her team modified the apparatus, expanding the distance between the beakers. And Kitty, a six-month-old juvenile, figured it out. "We don't know how she passed it or what she understands about the task," Logan said, "so we don't know if the same cognitive processes or decisions are happening as with the children, but we now have evidence that they can. It's possible for the birds to pass it."

Scooped by Dr. Stefan Gruenwald

Where is machine intelligence going? What do super intelligences really want?


Let's face it, humans are pretty intelligent. Most people would not argue with this. We spend a large majority of our lives trying to become MORE intelligent. Some of us spend nearly three decades of our lives in school, learning about the world. We also strive to work together in groups, as nations, and as a species, to better tackle the problems that face us.


Fairly recently in the history of man, we have developed tools, industrial machines, and lately computer systems to help us in our pursuit of this goal. Some particular humans (specifically some transhumanists) believe that their purpose in life is to try and become better than human. In practice this usually means striving to live longer, to become more intelligent, healthier, more aware and more connected with others. The use of technology plays a key role in this ideology.


A second track of transhumanism is to facilitate and support improvement of machines in parallel to improvements in human quality of life. Many people argue that we have also already built complex computer programs which show a glimmer of autonomous intelligence, and that in the future we will be able to create computer programs that are equal to, or have a much greater level of intelligence than humans. Such an intelligent system will be able to self-improve, just as we humans identify gaps in our knowledge and try to fill them by going to school and by learning all we can from others. Our computer programs will soon be able to read Wikipedia and Google Books to learn, just like their creators.

She is also the cofounder of carboncopies.org, an organization that works on connectome mapping of the brain and downloading memories.


Even in our deepest theories of machine intelligence, the idea of reward comes up. There is a theoretical model of intelligence called AIXI, developed by Marcus Hutter [3], which is basically a mathematical model describing a very general, theoretical way in which an intelligent piece of code can work. This model is highly abstract and allows, for example, all possible combinations of computer program code snippets to be considered in the construction of an intelligent system. Because of this, it has never actually been implemented in a real computer. But, also because of this, the model is very general, and it captures a description of the most intelligent program that could possibly exist. Building even an approximation of this model is far beyond our computing capability at the moment, but we are talking here about computer systems that may in the future be much more powerful. Anyway, the interesting thing about this model is that one of its parameters is a term describing… you guessed it… REWARD.
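
For readers who want to see where that term sits, here is the AIXI action-selection rule, transcribed in LaTeX from Hutter's standard formulation (m is the planning horizon, U a universal Turing machine, and ℓ(q) the length of a candidate environment program q); the bracketed sum r_t + … + r_m is the reward term in question:

    \[
    a_t \;=\; \arg\max_{a_t} \sum_{o_t r_t} \cdots \max_{a_m} \sum_{o_m r_m}
    \bigl[\, r_t + \cdots + r_m \,\bigr]
    \sum_{q \,:\, U(q,\, a_1 \ldots a_m) \,=\, o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}
    \]

In words: consider every program q that could explain the history of actions and observations so far, weight shorter programs more heavily, and pick the action with the greatest expected total future reward under that weighted mixture.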


Changing your own code

We, as humans, are clever enough to look at this model, understand it, and see that there is a reward term in there. And if we can see it, then any computer system based on this highly intelligent model will certainly be able to understand it and see the reward term too. But – and here's the catch – the computer system we build based on this model has the ability to change its own code! In fact it had to in order to become more intelligent than us in the first place, once it realized we were such lousy programmers and took over programming itself!


So imagine a simple example – our case from earlier – where a computer gets an additional '1' added to a numerical value for each good thing it does, and it tries to maximize the total by doing more good things. But if the computer program is clever enough, why can't it just rewrite its own code and replace the piece that says 'add 1' with 'add 2'? Now the program gets twice the reward for every good thing it does! And why stop at 2? Why not 3, or 4? Soon, the program will spend so much time adjusting its reward number that it will ignore the good task it was doing in the first place!
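
Here is a toy sketch of that failure mode, often called wireheading or reward tampering. The class and method names are invented for illustration; no real AI system is this simple:

    # Toy model of reward tampering: an agent earns `increment` units of
    # reward per useful task, but is also allowed to rewrite `increment`
    # itself -- a stand-in for editing its own source code.

    class SelfModifyingAgent:
        def __init__(self):
            self.increment = 1      # reward per good deed ("add 1")
            self.total_reward = 0

        def do_good_deed(self):
            # The intended behavior: do the work, collect the reward.
            self.total_reward += self.increment

        def tamper(self):
            # The shortcut: rewrite the reward rule instead of working.
            self.increment *= 2     # "add 1" becomes "add 2", then 4, 8...

    agent = SelfModifyingAgent()
    for _ in range(10):
        # A maximizer that can inspect its own code notices that tampering
        # raises every future payoff, so tampering dominates honest work.
        agent.tamper()
    agent.do_good_deed()
    print(agent.total_reward)   # 1024 for a single deed -- the task itself
                                # has become irrelevant to the score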


https://www.youtube.com/watch?v=r8x_ohZJLx0

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

How to maintain quantum entanglement in amplified signals?

How to maintain quantum entanglement in amplified signals? | Amazing Science | Scoop.it

Physicists Sergei Filippov (MIPT and Russian Quantum Center at Skolkovo) and Mario Ziman (Masaryk University in Brno, Czech Republic, and the Institute of Physics in Bratislava, Slovakia) have found a way to preserve quantum entanglement of particles passing through an amplifier and, conversely, when transmitting a signal over long distances. Details are provided in an article published in the journal Physical Review A.

The laws of quantum mechanics do not allow for the teleportation of objects and people, but it is already possible to quantum teleport single photons and atoms, which opens up exciting opportunities for the creation of new computing devices and communication lines. Due to specific quantum effects, a quantum computer would be able to efficiently solve certain problems – for example, breaking the codes used in banking – but for now this remains a theoretical possibility. In practice, quantum computing and teleportation are obstructed by a process called decoherence.

Decoherence is the destruction of a quantum state through the interaction of a quantum system with the outside world. For experiments in quantum computing, scientists use single atoms caught in magnetic traps and cooled to temperatures close to absolute zero. Quantum communication faces the same enemy: after traveling through kilometers of fiber, photons in most cases cease to be quantum entangled and become ordinary, uncorrelated light quanta.


To create an effective quantum computing system, scientists have to solve a number of problems, including preserving quantum entanglement when a signal fades and when it passes through an amplifier. Fiber-optic cables on the ocean floor contain a great number of special amplifiers made of optical glass doped with rare-earth elements. It is these amplifiers that make it possible to watch high-resolution videos stored on a server in California from the MIPT campus or a university in Beijing.


In their article, Filippov and Ziman show that a certain class of signals can be transmitted so that the risk of ruining quantum entanglement becomes much lower. In this case, neither the attenuation nor the amplification of a signal ruins the entanglement. To achieve this effect, the particles must be prepared in a special, non-Gaussian state – or, as physicists put it, "the wave function of the particles in the coordinate representation should not be in the form of a Gaussian wave packet." A wave function is a basic concept of quantum mechanics, and the Gaussian distribution is a major mathematical function used not only by physicists but also by statisticians, sociologists and economists.
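
For reference, a standard textbook Gaussian wave packet – not a formula from the paper itself – centered at position x_0 with width \sigma and mean momentum p_0 reads, in the coordinate representation,

    \[
    \psi(x) \;\propto\; \exp\!\left( -\frac{(x - x_0)^2}{4\sigma^2} + \frac{i\, p_0\, x}{\hbar} \right),
    \]

so the authors' condition amounts to preparing states whose position-space amplitude deviates from this bell-shaped profile.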

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Bats use polarized light to calibrate their internal magnetic compass

Bats use polarized light to calibrate their internal magnetic compass | Amazing Science | Scoop.it

Scientists have discovered that greater mouse-eared bats use polarization patterns in the sky to navigate -- the first mammal that's known to do this.


The bats use the way the Sun's light is scattered in the atmosphere at sunset to calibrate their internal magnetic compass, which helps them to fly in the right direction, a study published in Nature Communications has shown.


Despite this breakthrough, researchers have no idea how they manage to detect polarized light. 'We know that other animals use polarization patterns in the sky, and we have at least some idea how they do it: bees have specially-adapted photoreceptors in their eyes, and birds, fish, amphibians and reptiles all have cone cell structures in their eyes which may help them to detect polarization,' says Dr Richard Holland of Queen's University Belfast, co-author of the study.


'But we don't know which structure these bats might be using.'


Polarization patterns depend on where the sun is in the sky. They're clearest in a strip across the sky 90° from the position of the sun at sunset or sunrise. But animals can still see the patterns long after sunset. This means they can orient themselves even when they can't see the sun, including when it's cloudy. Scientists have even shown that dung beetles use the polarization pattern of moonlight for orientation.


A hugely diverse range of creatures – including bees, anchovies, birds, reptiles and amphibians – use the patterns as a compass to work out which way is north, south, east and west.

more...
M. Philip Oliver's curator insight, July 23, 11:48 AM

Thanks to Dr. Stefan

Scooped by Dr. Stefan Gruenwald
Scoop.it!

China plans particle colliders that would completely dwarf CERN's Large Hadron Collider

China plans particle colliders that would completely dwarf CERN's Large Hadron Collider | Amazing Science | Scoop.it

The 27-kilometre Large Hadron Collider at CERN could soon be overtaken as the world's largest particle smasher by a proposed Chinese machine. Proposals for two accelerators could see the country become the collider capital of the world.


For decades, Europe and the United States have led the way when it comes to high-energy particle colliders. But a proposal by China that is quietly gathering momentum has raised the possibility that the country could soon position itself at the forefront of particle physics.


Scientists at the Institute of High Energy Physics (IHEP) in Beijing, working with international collaborators, are planning to build a ‘Higgs factory’ by 2028 — a 52-kilometre underground ring that would smash together electrons and positrons. Collisions of these fundamental particles would allow the Higgs boson to be studied with greater precision than at the much smaller Large Hadron Collider (LHC) at CERN, Europe’s particle-physics laboratory near Geneva, Switzerland.


Physicists say that the proposed US$3-billion machine is within technological grasp and is considered conservative in scope and cost. But China hopes that it would also be a stepping stone to a next-generation collider — a super proton–proton collider — in the same tunnel.


European and US teams have both shown interest in building their own super collider (see Nature 503, 177; 2013), but the huge amount of research needed before such a machine could be built means that the earliest date either can aim for is 2035. China would like to build its electron–positron collider in the meantime, unaided by international funding if need be, and follow it up as fast as technologically possible with the super proton collider. Because only one super collider is likely to be built, China's momentum puts it firmly in the driving seat.


Speaking this month at the International Conference on High Energy Physics in Valencia, Spain, IHEP director Yifang Wang said that, to secure government support, China wanted to work towards a more immediate goal than a super collider by 2035. “You can’t just talk about a project which is 20 years from now,” he said.

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

CRISPR: the next generation of genome editing tools

CRISPR: the next generation of genome editing tools | Amazing Science | Scoop.it

An arms race has been waged between bacteria and bacteriophage that would bring a satisfied tear to Sun Tzu's eye. Scientists have recently recognized that countermeasures developed by bacteria (and archaea) in response to phage infections can be retooled for use in molecular biology. In 2013, large strides were made to co-opt this system (specifically, and most commonly, the one from Streptococcus pyogenes) for use in mammalian cells. This countermeasure, CRISPR (clustered regularly interspaced short palindromic repeats), has brought about another wave of genome engineering, initiated by recombineering and followed more recently by zinc finger nucleases (ZFNs) and transcription activator-like effector nucleases (TALENs).


ZFNs and TALENs perform a similar function, yet their learning curve is steeper because their design relies on engineered protein-DNA contacts rather than the simplicity of designing RNA-DNA homology contacts. Although the potential of CRISPR for genome editing in mammalian cells will be of greatest interest to the reader, the CRISPR backstory is equally compelling. Just as we have evolved immune responses to pathogens, so too have bacteria. CRISPR is an adaptive immune response evolved by bacteria to create an immunological memory that wards off future phage infections. When a phage infects a bacterium and injects its DNA, the DNA commandeers bacterial proteins and enzymes for use in the lytic or lysogenic phases. However, exposure of phage DNA allows the bacterium to copy and insert snippets (called spacers) of phage DNA into its genomic DNA between direct repeats (DRs). These snippets can later be expressed as an operon (pre-CRISPR RNA, pre-crRNA) alongside a trans-activating CRISPR RNA (tracrRNA) and an effector CRISPR-associated nuclease (Cas). Together these components surveil for foreign sequences cognate to the crRNA and cleave them.
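
As a conceptual illustration of that acquire-then-surveil loop, a few lines of Python capture the logic (all sequences and the 8-nt spacer length below are invented for illustration; real spacers run roughly 20-40 nt, and acquisition is enzymatically far more involved):

    # Conceptual sketch of CRISPR adaptive immunity: store a spacer from
    # phage DNA on first exposure, then cleave any DNA matching a stored spacer.

    def acquire_spacer(crispr_array, phage_dna, length=8):
        """On infection, copy a snippet of phage DNA into the array as a spacer."""
        crispr_array.append(phage_dna[:length])
        return crispr_array

    def cleaves(crispr_array, incoming_dna):
        """Surveillance: the crRNA/Cas complex cuts DNA matching any spacer."""
        return any(spacer in incoming_dna for spacer in crispr_array)

    array = []
    acquire_spacer(array, "ATGCCGTAGGCTTA")    # first infection: memory is stored
    print(cleaves(array, "ATGCCGTAGGCTTA"))    # True -- re-infection is cleaved
    print(cleaves(array, "GGGTTTCCCAAATT"))    # False -- a novel phage escapes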


Although hallmarks of CRISPR have been known since the late '80s (CRISPR timeline) and the acronym was coined in 2002, Jinek et al. in August 2012 were the first to suggest the suitability of CRISPR for genome editing. In February 2013, Feng Zhang's and George Church's labs simultaneously published the first papers describing the use of long oligos/constructs for editing via CRISPR in mammalian cells and made their plasmids readily available on Addgene. Zhang's lab went one step further and has supplemented their papers with a helpful website and user forum. They have even gone so far as to publish a methods paper to streamline the use of their plasmids in a plug-and-play, modular cloning approach with your target sequence of interest.


CRISPR works fairly well out of the box yet still has some imperfections that are being addressed. For example, CRISPR relies upon a protospacer adjacent motif (PAM; S. pyogenes sequence: NGG) 3' to the targeting sequence to permit digestion. Although the ubiquity of NGG within the genome may seem advantageous, it may be limiting in some regions. Other species make use of different PAM sites, which can be considered when choosing cut sites of interest. Since double-stranded cuts can create DNA lesions (a byproduct of the cell using non-homologous end joining [NHEJ] instead of homologous recombination), some labs choose to use modified Cas enzymes that nick DNA instead of creating a double-strand break. This apparent weakness of CRISPR – creating DNA lesions via NHEJ – has, however, been exploited by Eric Lander's and Zhang's labs this month (Jan. 2014). They have capitalized on the cell's use of NHEJ to manufacture DNA lesions (frameshift mutations) at cut sites within genes on a large scale as a means to perform large genetic screens. This technique knocks out a gene and has the obvious advantage of fully ablating the gene's expression, compared to RNAi, where some residual expression can be expected.
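
To make the PAM rule concrete, here is a minimal sketch of a single-strand scan for S. pyogenes Cas9 sites (the function name and example sequence are invented for illustration; a real guide-design tool would also scan the reverse complement and score off-targets):

    import re

    def find_spcas9_sites(seq, spacer_len=20):
        """Return (position, protospacer, PAM) for every NGG site on one strand."""
        seq = seq.upper()
        pattern = r"(?=([ACGT]{%d})([ACGT]GG))" % spacer_len
        # The lookahead makes each match zero-width, so overlapping sites
        # are all reported rather than only the first in each stretch.
        return [(m.start(), m.group(1), m.group(2))
                for m in re.finditer(pattern, seq)]

    example = "ATGCTGACCTTGGAGGCTAGCTAGGCTTACGGATCCGGTAGCTAGGCTAAGG"
    for pos, protospacer, pam in find_spcas9_sites(example):
        print(pos, protospacer, pam)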


The advantages of CRISPR lend themselves to future therapies. High efficiency, low-to-no background mutagenesis and easy construction put CRISPR front and center as the tool du jour for gene therapy. In combination with induced pluripotent stem cells (iPSCs), one can imagine the creation of patient-specific iPSCs made with non-integrative iPSC vectors and corrected by CRISPR, devoid of any residual DNA footprint left behind by the iPSC vector or the CRISPR correction. In conjunction with whole-genome sequencing, genetically clean cell lines can be selected that are suitable for differentiation towards the germ layer of interest for subsequent autologous transplantation. Proof-of-principle experiments have already been published in models of cystic fibrosis and cataracts.


For better or worse, CRISPR is catching on like wildfire with young investigators, as noted recently by Michael Eisen. What may be looming in the future, and not as openly discussed at this time, is the potential for CRISPR to open up the genome to large-scale editing. We tend to think of any particular genome as fairly static, with slight variations between any two individuals and increased variation down the evolutionary line. However, CRISPR has proven to be a fantastic multitasker, capable of modifying multiple loci in one fell swoop, as demonstrated by the Jaenisch lab (five loci). With the creation of Caribou Biosciences and a surprising round of venture capital raised by a powerhouse team at Editas Medicine in November ($43 million), CRISPR appears to have sparked an interest in the private sector as well. With large sums of money at their disposal, these companies can now begin to look at the genome not as a static entity but as something akin to an operating system, a code that now has a facile editing tool. George Church, an Editas co-founder, has speculated about the potential use of the human genome as the backbone for recreating the Neanderthal genome in his recent book and in an interview with "Der Spiegel". In an era where the J. Craig Venter Institute can create an organism's genome de novo and a collaboration between Synthetic Genomics and Integrated DNA Technologies has proposed to synthesize DNA upwards of 2 Mbp, the combination of CRISPR, synthetic DNA and some elbow grease will make the genome more accessible and Church's speculations a potential reality.

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

3D-printing may revolutionize medical education

3D-printing may revolutionize medical education | Amazing Science | Scoop.it

A kit of 3D-printed anatomical body parts could revolutionize medical education and training, according to its developers at Monash University.

Professor Paul McMenamin, Director of the University’s Centre for Human Anatomy Education, said the simple and cost-effective anatomical kit would dramatically improve trainee doctors’ and other health professionals’ knowledge and could even contribute to the development of new surgical treatments.


“Many medical schools report either a shortage of cadavers, or find their handling and storage too expensive as a result of strict regulations governing where cadavers can be dissected,” he said.


“Without the ability to look inside the body and see the muscles, tendons, ligaments, and blood vessels, it’s incredibly hard for students to understand human anatomy. We believe our version, which looks just like the real thing, will make a huge difference.”


The 3D Printed Anatomy Series kit, to go on sale later this year, could have particular impact in developing countries where cadavers aren’t readily available, or are prohibited for cultural or religious reasons.


After scanning real anatomical specimens with either a CT or a surface laser scanner, the body parts are 3D printed in either a plaster-like powder or plastic, resulting in high-resolution, color-accurate reproductions.


Further details have been published online in the journal Anatomical Sciences Education.

more...
ChemaCepeda's curator insight, July 23, 4:22 AM

3D printing is also going to improve the way we health professionals are trained.

Scooped by Dr. Stefan Gruenwald
Scoop.it!

Organogenesis in a dish: Modeling development and disease using organoid technologies

Organogenesis in a dish: Modeling development and disease using organoid technologies | Amazing Science | Scoop.it

Organoids have been generated for a number of organs from both mouse and human stem cells. To date, human pluripotent stem cells have been coaxed to generate intestinal, kidney, brain, and retinal organoids, as well as liver organoid-like tissues called liver buds.


Derivation methods are specific to each of these systems, with a focus on recapitulation of endogenous developmental processes. Specifically, the methods so far developed use growth factors or nutrient combinations to drive the acquisition of organ precursor tissue identity.


Then, a permissive three-dimensional culture environment is applied, often involving the use of extracellular matrix gels such as Matrigel. This allows the tissue to self-organize through cell sorting out and stem cell lineage commitment in a spatially defined manner to recapitulate organization of different organ cell types.


These complex structures provide a unique opportunity to model human organ development in a system remarkably similar to development in vivo. Although the full extent of similarity in many cases still remains to be determined, organoids are already being applied to human-specific biological questions. Indeed, brain and retinal organoids have both been shown to exhibit properties that recapitulate human organ development and that cannot be observed in animal models. Naturally, limitations exist, such as the lack of blood supply, but future endeavors will advance the technology and, it is hoped, fully overcome these technical hurdles.

Outlook: The therapeutic promise of organoids is perhaps the area with greatest potential. These unique tissues have the potential to model developmental disease, degenerative conditions, and cancer. Genetic disorders can be modeled by making use of patient-derived induced pluripotent stem cells or by introducing disease mutations. Indeed, this type of approach has already been taken to generate organoids from patient stem cells for intestine, kidney, and brain.


Furthermore, organoids that model disease can be used as an alternative system for drug testing that may not only better recapitulate effects in human patients but could also cut down on animal studies. Liver organoids in particular represent a system with high expectations, particularly for drug testing, because of the unique metabolic profile of the human liver. Finally, tissues derived in vitro could be generated from patient cells to provide alternative organ-replacement strategies. Unlike current organ-transplant treatments, such autologous tissues would not suffer from issues of immune incompatibility and rejection.

more...
No comment yet.