Amazing Science
Amazing science facts - 3D_printing • aging • AI • anthropology • art • astronomy • bigdata • bioinformatics • biology • biotech • chemistry • computers • cosmology • education • environment • evolution • future • genetics • genomics • geosciences • green_energy • history • language • map • material_science • math • med • medicine • microscopy • nanotech • neuroscience • paleontology • photography • photonics • physics • postings • robotics • science • technology • video

Newly declared species may have been the largest flying bird that ever lived

After decades with the title, an extinct bird loses its claim to the widest wing span in history.


When South Carolina construction workers came across the giant, winged fossil at the Charleston airport in 1983, they had to use a backhoe to pull the bird, which lived about 25 million years ago, up from the earth.


But if the bird was actually a brand-new species, researchers faced a big question: Could such a large bird, with a wingspan of 20 to 24 feet, actually get off the ground? After all, the larger the bird, the less likely its wings are able to lift it unaided.


The answer came from Dan Ksepka, paleontologist and science curator at the Bruce Museum in Greenwich, Conn.


He modeled a probable method of flight for the long-extinct bird, named as a new species this week in the Proceedings of the National Academy of Sciences. If Ksepka’s simulations are correct, Pelagornis sandersi would be the largest airborne bird ever discovered.


Pelagornis sandersi relied on the ocean to keep it aloft. Similar in many ways to a modern-day albatross — although with at least twice the wingspan and very different in appearance, Ksepka said — the bird probably needed a lot of help to fly. It had to run downhill into a head wind, catching the air like a hang glider. Once airborne, it relied on air currents rising from the ocean to keep it gliding.


New Technique Provides a Clear and Rapid Means of Classifying Supernova Remnants


By observing specific X-ray emissions from iron atoms in the core of supernova remnants, astronomers developed a new technique that provides a clear and rapid means of classifying supernova remnants.


An international team of astronomers using data from the Japan-led Suzaku X-ray observatory has developed a powerful technique for analyzing supernova remnants, the expanding clouds of debris left behind when stars explode. The method provides scientists with a way to quickly identify the type of explosion and offers insights into the environment surrounding the star before its destruction.


“Supernovae imprint their remnants with X-ray evidence that reveals the nature of the explosion and its surroundings,” said lead researcher Hiroya Yamaguchi, an astrophysicist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “Thanks to Suzaku, we are now learning how to interpret these signals.”


The technique involves observing specific X-ray emissions from iron atoms in the core of supernova remnants. Even after thousands of years, these atoms remain extremely hot, stripped of most of the 26 electrons that accompany iron atoms under normal conditions on Earth. The metal is formed in the centers of shattered stars toward the end of their energy-producing lives and in their explosive demise, which makes it a key witness to stellar death.


“Because Suzaku has a better sensitivity to iron emission lines than any other X-ray mission, it’s the ideal tool for investigating supernova remnants at these energies,” said Robert Petre, chief of Goddard’s X-ray Astrophysics Laboratory and a member of the study team. Suzaku was launched into Earth orbit in 2005, the fifth in a series of Japanese X-ray astronomy satellites. It was developed and is operated cooperatively by the United States and Japan.


Astronomers estimate that a supernova occurs once or twice a century in our home galaxy, the Milky Way. Each time, a blast wave and a shell of hot stellar debris expands rapidly away from the detonation, creating a supernova remnant that can be detected for tens of thousands of years. The expanding cloud slows over time as it mixes with interstellar gas and eventually becomes indistinguishable from it.


Parasitic wasp turns roaches into zombie slaves using a neurotoxic cocktail


For decades, scientists have tried to understand the complex and gruesome relationship between the parasitic emerald wasp Ampulex compressa and its much larger victim, the common household cockroach Periplaneta americana.


At first glance, this parasite-prey relationship seems much like any other: the female wasp stings the cockroach, lays an egg on its abdomen, and once hatched, the hungry larva feeds on the cockroach. However, while most parasitic insects tend to paralyse their victims with a venomous sting, the emerald wasp instead manipulates the cockroach’s behaviour, essentially transforming it into a zombie slave.



With two stings the cockroach is left with the ability to walk, but is entirely robbed of the power to initiate its own movement. The wasp, now tired after administering two stings, regains its energy by cutting off the ends of the cockroach's antennae and drinking its blood. Revitalised, it then latches on to the stung cockroach's antennae and, much like an obedient toddler being led to his first day of school, the submissive insect follows the wasp's orders.


The first sting, administered to a mass of nerve tissue in the cockroach’s thorax, contains large quantities of gamma amino butyric acid (GABA), and complementary chemicals called taurine and beta alanine. GABA is a neurotransmitter that blocks the transmission of motor signals between nerves, and, together with the other two chemicals, it temporarily paralyses the cockroach’s front legs. This prevents the cockroach from escaping while the wasp inflicts the second, more toxic sting directly into the roach’s brain.


It is the second sting that turns the cockroach into a zombie, and contains what Frederic Libersat and his colleagues at Ben Gurion University refer to as a “neurotoxic cocktail”. The venom of the second sting blocks the receptors for another neurotransmitter called octopamine, which is involved in the initiation of spontaneous and complex movements such as walking.


Libersat has shown that unstung cockroaches injected with an octopamine-like compound show an increase in walking behaviour. Those injected with a chemical that blocks octopamine, however, show a reduction in spontaneous walking, much like the victims of the wasp sting. Zombie cockroaches were also able to recover from their stupor and walk after they were injected with a chemical that reactivates octopamine receptors.


Size of the human genome reduced to 19,000 genes


A study led by Alfonso Valencia, Vice-Director of Basic Research at the Spanish National Cancer Research Centre (CNIO) and head of the Structural Computational Biology Group, and Michael Tress, a researcher in the same group, updates the number of human genes (those that can generate proteins) to 19,000: 1,700 fewer than in the most recent annotation, and well below the initial estimates of 100,000 genes. The work, published in the journal Human Molecular Genetics, concludes that almost all of these genes have ancestors prior to the appearance of primates 50 million years ago.


"The shrinking human genome," that's how Valencia describes the continuous corrections to the numbers of the protein-coding genes in the human genome over the years that has culminated in the approximately 19,000 human genes described in the present work. "The coding part of the genome [which produces proteins] is constantly moving," he adds: "No one could have imagined a few years ago that such a small number of genes could make something so complex."


The scientists began by analysing proteomics experiments; proteomics is the most powerful tool for detecting protein molecules. To map human proteins, the researchers integrated data from seven large-scale mass spectrometry studies covering more than 50 human tissues, "in order to verify which genes really do produce proteins," says Valencia.


The results brought to light just over 12,000 proteins, and the researchers mapped these proteins to the corresponding regions of the genome. They then analysed thousands of genes that were annotated in the human genome but that did not appear in the proteomics analysis, and concluded: "1,700 of the genes that are supposed to produce proteins almost certainly do not for various reasons, either because they do not exhibit any protein-coding features, or because the conservation of their reading frames does not support protein-coding ability," says Tress.
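
In outline, that last step is a comparison between two gene sets: the annotated protein-coding genes and those backed by peptide evidence. A toy sketch with made-up gene identifiers (not the study's actual pipeline):

```python
# Hypothetical identifiers for illustration only; the real analysis compares
# ~20,000 annotated genes against peptides pooled from seven MS studies.
annotated_coding_genes = {"GENE_A", "GENE_B", "GENE_C", "GENE_D"}
genes_with_peptide_evidence = {"GENE_A", "GENE_C"}

unsupported = annotated_coding_genes - genes_with_peptide_evidence
print(f"{len(unsupported)} annotated genes lack proteomics support: {sorted(unsupported)}")
```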


One hypothesis derived from the study is that more than 90% of human genes produce proteins that originated in metazoans or multicellular organisms of the animal kingdom hundreds of millions of years ago; the figure is over 99% for those genes whose origin predates the emergence of primates 50 million years ago.


"Our figures indicate that the differences between humans and primates at the level of genes and proteins are very small," say the researchers. David Juan, author and researcher in the Valencia lab, says that "the number of new genes that separate humans from mice [those genes that have evolved since the split from primates] may even be fewer than ten." This contrasts with the more than 500 human genes with origins since primates that can be found in the current annotation. The researchers conclude: "The physiological and developmental differences between primates are likely to be caused by gene regulation rather than by differences in the basic functions of the proteins in question."


The sources of human complexity lie less in the number of genes than in how genes are used: in the thousands of chemical changes that occur in proteins, and in the control of the production of these proteins by non-coding regions of the genome, which comprise 90% of the entire genome and have been described in the latest findings of the international ENCODE project, in which the Valencia team participates.


The work brings the number of human genes closer to that of other species such as the nematode worm Caenorhabditis elegans, which is just 1 mm long and apparently less complex than humans. Valencia, however, prefers not to make comparisons: "The human genome is the best annotated, but we still believe that 1,700 genes may have to be re-annotated. Our work suggests that we will have to redo the calculations for all genomes, not only the human genome."


The research results are part of GENCODE, a consortium which is integrated into the ENCODE Project and formed by research groups from around the world, including the Valencia team, whose task is to provide an annotation of all the gene-based elements in the human genome.


"Our data are being discussed by GENCODE for incorporation into the new annotations. When this happens it will redefine the entire mapping of the human genome, and how it is used in macro projects such as those for cancer genome analysis ," says Valencia.

Laura E. Mirian, PhD's curator insight, July 8, 2014 10:42 AM

"Our figures indicate that the differences between humans and primates at the level of genes and proteins are very small," say the researchers. 


Small, but plentiful: how the faintest galaxies illuminated the early universe


Light from tiny galaxies over 13 billion years ago played a larger role than previously thought in creating the conditions in the universe as we know it today, a new study has found. Ultraviolet (UV) light from stars in these faint dwarf galaxies helped strip interstellar hydrogen of electrons in a process called re-ionization.


The epoch of re-ionization began about 200 million years after the Big Bang, and astrophysicists agree that it took about 800 million years more for the entire universe to become re-ionized. It marked the last major phase transition of gas in the universe, and it remains ionized today.


Astrophysicists aren’t in agreement when it comes to determining which type of galaxies played major roles in this epoch. Most have focused on large galaxies. However, a new theory by researchers at the Georgia Institute of Technology and the San Diego Supercomputer Center indicates scientists should also focus on the smallest.  The findings are reported in a paper published today in the journal Monthly Notices of the Royal Astronomical Society.


The researchers used computer simulations to demonstrate the faintest and smallest galaxies in the early universe were essential. These tiny galaxies – despite being 1000 times smaller in mass and 30 times smaller in size than the Milky Way – contributed nearly 30 percent of the UV light during this process.


Re-ionization experts often ignored these dwarf galaxies because they didn't think they formed stars; it was assumed that UV light from nearby galaxies was too strong and suppressed star formation in these tiny neighbors.


“It turns out they did form stars, usually in one burst, around 500 million years after the Big Bang,” said John Wise, a Georgia Tech assistant professor in the School of Physics who led the study. “The galaxies were small, but so plentiful that they contributed a significant fraction of UV light in the re-ionization process.”


The team’s simulations modeled the flow of UV stellar light through the gas within galaxies as they formed. They found that the fraction of ionizing photons escaping into intergalactic space was 50 percent in small (more than 10 million solar masses) halos. It was only 5 percent in larger halos (300 million solar masses).  This elevated fraction, combined with their high abundance, is exactly the reason why the faintest galaxies play an integral role during re-ionization.


“It’s very hard for UV light to escape galaxies because of the dense gas that fills them,” said Wise. “In small galaxies, there’s less gas between stars, making it easier for UV light to escape because it isn’t absorbed as quickly. Plus, supernova explosions can open up channels more easily in these tiny galaxies in which UV light can escape.”


The team’s simulation results provide a gradual timeline that tracks the progress of re-ionization over hundreds of millions of years. About 300 million years after the Big Bang, the universe was 20 percent ionized. It was 50 percent at 550 million years. The universe was fully ionized at 860 million years after its creation.




Newly spotted frozen world orbits in a binary star system


A newly discovered planet in a binary star system located 3,000 light-years from Earth is expanding astronomers’ notions of where Earth-like—and even potentially habitable—planets can form, and how to find them.


At twice the mass of Earth, the planet orbits one of the stars in the binary system at almost exactly the same distance from which Earth orbits the sun. However, because the planet’s host star is much dimmer than the sun, the planet is much colder than the Earth—a little colder, in fact, than Jupiter’s icy moon Europa.


Four international research teams, led by professor Andrew Gould of The Ohio State University, published their discovery in the July 4 issue of the journal Science.


The study provides the first evidence that terrestrial planets can form in orbits similar to Earth's, even in a binary star system where the stars are not very far apart. Although this planet itself is too cold to be habitable, the same planet orbiting a sun-like star in such a binary system would be in the so-called "habitable zone"—the region where conditions might be right for life.


“This greatly expands the potential locations to discover habitable planets in the future,” said Scott Gaudi, professor of astronomy at Ohio State. “Half the stars in the galaxy are in binary systems. We had no idea if Earth-like planets in Earth-like orbits could even form in these systems. ”



Oldest case of Down's syndrome from medieval France


The oldest confirmed case of Down's syndrome has been found: the skeleton of a child who died 1500 years ago in early medieval France. According to the archaeologists, the way the child was buried hints that Down's syndrome was not necessarily stigmatized in the Middle Ages.

Down's syndrome is a genetic disorder that delays a person's growth and causes intellectual disability. People with Down's syndrome have three copies of chromosome 21, rather than the usual two. It was described in the 19th century, but has probably existed throughout human history. However there are few cases of Down's syndrome in the archaeological record.

The new example comes from a 5th- and 6th-century necropolis near a church in Chalon-sur-Saône in eastern France. Excavations there have uncovered the remains of 94 people, including the skeleton of a young child with a short and broad skull, a flattened skull base and thin cranial bones. These features are common in people with Down's syndrome, says Maïté Rivollat at the University of Bordeaux in France, who has studied the skeleton with her colleagues.

"I think the paper makes a convincing case for a diagnosis of Down's syndrome," says John Starbuck at Indiana University in Indianapolis. He has just analyzed a 1500-year-old figurine from the Mexican Tolteca culture that he says depicts someone with Down's syndrome.

A similar argument was put forward in a 2011 study that described the 1500-year-old burial in Israel of a man with dwarfism (International Journal of Osteoarchaeology, DOI: 10.1002/oa.1285). The body was buried in a similar manner to others at the site, and archaeologists took that as indicating that the man was treated as a normal member of society.


Global Warming: Failing To Control Earth’s CO2


The whole world gathered in Copenhagen recently for the 15th Conference of the Parties (COP15) on climate change. On the agenda was how to cope with the rise in CO2 emissions, which, in addition to causing ocean acidification, could elevate the ocean level by as much as 60 cm by the end of the century.


This will jeopardize those living on islands and along shorelines – an estimated 100 million people may be menaced. In fact, humans are pumping 7 Gt of CO2 into the atmosphere yearly. The level of CO2 in the atmosphere today is around 370 ppm – according to specialists, it needs to remain below 420 ppm through the end of this century to keep global warming below 2 °C. Most solutions to reduce this trend are not short-term ones. An integrated approach to carbon abatement in the automotive sector could reduce global passenger vehicle greenhouse emissions by 2.2 Gt by 2030, much of it using proven technologies. Sugarcane-based ethanol production in Brazil, currently grown on 8 million hectares, can be substantially increased, but that must be done without harming the environment. The ethanol produced from 200 million tons of corn in the US will help reduce greenhouse emissions from cars. Together, both countries supply today only a fraction of what will be needed to replace automotive fossil fuel in the years to come.


This means abatement will not come from first-generation biofuels alone, but from a combination of second generation biofuel, traffic flow shifts and a mix of several other technologies. Carbon capture and storage can handle a few million metric tons of CO2 /year, while 6 billion metric tons of coal are burned each year, producing 18 billion tons of CO2.


Brazil hopes to reverse deforestation in the Amazon, which in recent decades has claimed an area larger than Germany, according to the National Institute for Space Research (INPE). To accomplish this, Brazil's National Plan on Climate Change was presented in Copenhagen, and it included efforts to achieve reforestation by 2020. This is a costly and long-term effort.


But deforestation is not a problem of the tropical forest alone. The vegetation of other ecosystems has been drastically reduced. Just 7% of the original vegetation of the Mata Atlântica is left. The Cerrado is being destroyed at a rate of 0.5% a year. Inadequate use of this biome, for ethanol production, for instance, could destroy the remaining 17% of the Cerrado.


So what can be done if the level of CO2 cannot be kept under control? Geo-engineering proposes simulated volcanic eruptions to reduce the planet's temperature and the level of ocean rise, based upon observations made after the Mount Pinatubo eruption in June 1991. That eruption injected 10 Tg of sulfur into the stratosphere, which caused detectable short-term cooling of the planet. A simulated injection of SO2 as an aerosol precursor, equivalent to the Mount Pinatubo eruption every two years, would cool the planet and consequently keep sea level rise below 20 cm for centuries ahead, although the (relatively less deadly) ocean acidification due to CO2 would persist.


I attended several discussions on this subject where most people accepted this fate, like lambs to the slaughterhouse. I proposed a strategy to desalinize sea water for irrigation or as a source of potable water where water is needed most: arid regions of developing countries. If the ocean level rises at a rate of 6 mm/year, and since the oceans occupy 360 × 10^6 km², the amount of water to be desalinized is 2.16 × 10^12 m³ per year. Considering that arid regions make up at least 10% of the planet, this amount of water corresponds to only 14 mm of rain falling on 15 million km².
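
As a quick check of the first figure using the article's round numbers, with the spread-over-area step left as a parameter (the arid-area figure above is ambiguous, and the resulting depth scales inversely with whatever area is assumed):

```python
# Unit check of the desalination volume quoted above; all inputs are the
# article's round figures, so this is order-of-magnitude arithmetic only.
ocean_area_m2 = 360e6 * 1e6          # 360 x 10^6 km^2 expressed in m^2
sea_level_rise_m_per_yr = 6e-3       # 6 mm/year

volume_m3_per_yr = ocean_area_m2 * sea_level_rise_m_per_yr
print(f"water volume: {volume_m3_per_yr:.2e} m^3/year")   # ~2.16e12, as quoted

def rain_equivalent_mm(volume_m3, area_km2):
    """Depth (mm/year) if the volume were spread evenly over the given area."""
    return volume_m3 / (area_km2 * 1e6) * 1e3

# The depth depends strongly on the assumed arid area (15 vs 150 million km^2).
for area_km2 in (15e6, 150e6):
    depth = rain_equivalent_mm(volume_m3_per_yr, area_km2)
    print(f"over {area_km2 / 1e6:.0f} million km^2: {depth:.0f} mm/year")
```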


So the amount of desalinized water from ocean rise alone may be insufficient to adequately irrigate large areas. Desalinized water could also be stored in reservoirs and underground aquifers. Potable water is scarce in many regions of the world, particularly in sub-Saharan Africa. According to UNEP, the lack of good-quality potable water threatens the lives of 1.1 billion people worldwide, due to infections resulting from unclean drinking water. Throughout most of the world, the most common contamination of raw water sources is from human sewage, in particular human faecal pathogens and parasites. In 2006, waterborne diseases were estimated to cause 1.8 million deaths each year, while about 1.1 billion people lacked proper drinking water. Thus, it is clear that people in the developing world need access to good-quality water in sufficient quantity, and need to be able to purify and distribute it.


In recent years, most desalination plants have yielded around 10^7 m³ of desalinized water annually. Alternative technologies may be needed to allow for desalination of 2.16 × 10^12 m³/year, the equivalent of 6 mm of ocean rise per year. A single project that desalinizes water from the Red Sea in Jordan has the capacity to produce 850 million m³ of desalinated water per year. That is 10 times the yields of the recent past, and the cost will be more than $10 billion, but it will benefit Israel, Jordan and the Palestinian Authority. Under the leadership of the Ministry of Water and Irrigation of Jordan, the project may need to gather funds of close to $40 billion for its complete implementation; this is achievable if additional bidders come aboard.


The project will stand as a symbol of peace and cooperation in the Middle East. One project alone yielding 8.5 × 10^8 m³ per year means 10,000 projects of this magnitude are needed. Ted Levin from the Natural Resources Defense Council says that more than 12,000 desalination plants already supply fresh water in 120 nations, mostly in the Middle East and the Caribbean. The market for desalination, according to analysts, will grow substantially over the next decades.


More than 99% of drug trials for Alzheimer's disease during the past decade have failed


There is an urgent need to increase the number of potential therapies being investigated, say US scientists. Only one new medicine has been approved since 2004, they report in the journal Alzheimer's Research & Therapy.


The drug failure rate is troubling and higher than for other diseases such as cancer, says Alzheimer's Research UK. Dr Jeffrey Cummings, of the Cleveland Clinic Lou Ruvo Center for Brain Health in Las Vegas, and colleagues examined a public website that records clinical trials.


Between 2002 and 2012, they found 99.6% of trials of drugs aimed at preventing, curing or improving the symptoms of Alzheimer's had failed or been discontinued. This compares with a failure rate of 81% for cancer drugs.


The failure rate was "especially troubling" given the rising numbers of people with dementia, said Dr Simon Ridley, of Alzheimer's Research UK. "The authors of the study highlight a worrying decline in the number of clinical trials for Alzheimer's treatments in more recent years," he said.


"There is a danger that the high failure rates of trials in the past will discourage pharmaceutical companies from investing in dementia research.


"The only way we will successfully defeat dementia is to continue with high quality, innovative research, improve links with industry and increase investment in clinical trials."


Sandy Spencer's curator insight, July 6, 2014 9:37 AM

This is so discouraging. I know everyone has high hopes of a cure or at least something to slow it down. But our wait goes on--


Scientists discover Achilles' heel in antibiotic-resistant bacteria

Scientists at the University of East Anglia have made a breakthrough in the race to solve antibiotic resistance.


New research published today in the journal Nature reveals an Achilles' heel in the defensive barrier which surrounds drug-resistant bacterial cells.

The findings pave the way for a new wave of drugs that kill superbugs by bringing down their defensive walls rather than attacking the bacteria itself. It means that in future, bacteria may not develop drug-resistance at all.


The discovery doesn't come a moment too soon. The World Health Organization has warned that antibiotic resistance in bacteria is spreading globally, with severe consequences: even common infections that have been treatable for decades can once again kill.


Researchers investigated a class of bacteria called 'Gram-negative bacteria', which is particularly resistant to antibiotics because of its cells' impermeable, lipid-based outer membrane. This outer membrane acts as a defensive barrier against attacks from the human immune system and antibiotic drugs. It allows the pathogenic bacteria to survive, but removing this barrier causes the bacteria to become more vulnerable and die.


Until now little has been known about exactly how the defensive barrier is built. The new findings reveal how bacterial cells transport the barrier building blocks (called lipopolysaccharides) to the outer surface. Group leader Prof Changjiang Dong, from UEA's Norwich Medical School, said: "We have identified the path and gate used by the bacteria to transport the barrier building blocks to the outer surface. Importantly, we have demonstrated that the bacteria would die if the gate is locked."


"This is really important because drug-resistant bacteria is a global health problem. Many current antibiotics are becoming useless, causing hundreds of thousands of deaths each year.


"The number of super-bugs are increasing at an unexpected rate. This research provides the platform for urgently-needed new generation drugs." Lead author PhD student Haohao Dong said: "The really exciting thing about this research is that new drugs will specifically target the protective barrier around the bacteria, rather than the bacteria itself.


"Because new drugs will not need to enter the bacteria itself, we hope that the bacteria will not be able to develop drug resistance in future."


Soft-Robotics: The robots of the future won't look anything like the Terminator


The field of soft robotics has attracted a rush of attention in the last year. Down the road at Harvard, multiple groups are working on soft robotic hands, jumping legs, exosuits, and quadrupeds that can do the limbo. At Worcester Polytechnic Institute's Soft Robotics Lab, researchers are building a snake. In San Francisco, a startup called Otherlab is building inflatable robots that can shake hands, walk, and carry riders. In Italy, a group of researchers built a robotic tentacle modeled after an octopus.

Before the 1970s, car companies made cars safer by making them larger and heavier. Then along came the airbag: a lightweight safety device that folded up invisibly into the vehicle until it sensed a crash. Similar revolutions took place with body armor, bridges, and contact lenses, and these researchers believe something similar is happening with robots.

"It’s not a part of conventional robotics technologies," says Fumiya Iida, a professor of bio-inspired robotics at the Swiss Federal Institute of Technology-Zurich and a member of the IEEE committee on soft robotics. "They have to think completely differently, use different materials, different energy sources. Definitely this is the way we should go in the long run." One of the most impressive rigid robots in the world right now is Boston Dynamics’ 300-pound humanoid Atlas. If Atlas wants to pick up a ball, it needs to sense and compute the precise distance between its digits and the ball and figure out exactly where to place its hand and how much pressure to apply.


Robots like Atlas "are doing a lot of thinking," says Barry Trimmer, PhD, a professor at Tufts and the editor of a new journal, Soft Robotics, which launched last month. "There’s a lot of hesitancy. ‘Where do I put my foot next?’ Animals just don't do that. We need to get away from the idea that you have to control every variable."

By contrast, Harvard’s starfish-shaped soft gripper only needs to be told to inflate. As it’s pumped full of air, it conforms to the shape of an object until its "fingers" have enough pressure to lift it. Another example would be a human picking up a glass of water. We don’t have to compute the exact size and shape of the glass with our brains; our hand adapts to the object. Similarly, Bubbles doesn’t calculate the full length of its movement.


There are technological challenges as well. In addition to air and fluid pressure actuators, soft roboticists are experimenting with dielectric elastomers, elastic materials that expand and contract in response to electric voltage; shape-memory alloys, metal alloys that can be programmed to change shape at certain temperatures; and springs that respond to light. These approaches are still rudimentary, as are the control systems that operate the robots. In the case of many of Harvard’s soft robots, it’s simply a syringe of air attached to a tube.


The field is so new, however, that no possibilities have yet been ruled out. Soft robotics technologies could theoretically be used in a wearable pair of human wings. More practically, soft robots could easily pack eggs or pick fruit — traditional hard robots, equipped with superhuman grips, are more likely to break yolks and inadvertently make applesauce. A mass of wormlike "meshworm" robots could be filled with water and dropped over a disaster area, where they would crawl to survivors. A soft robotic sleeve could be worn to eliminate tremors or supplement strength lost with age. Soft robots could be used in space exploration, where weight is hugely important; in prosthetics, where they would provide comfort and lifelikeness; in the home, where they can help out around the house without trampling the dog; and in surgical robots, where operators have inspired a few lawsuits after puncturing patients' insides.


Rudolf Kabutz's curator insight, July 3, 2014 6:38 AM

Do robots have to be hard and metallic? Soft spongy robots could have many advantages.

Anne Pascucci, MPA, CRA's curator insight, July 3, 2014 8:44 AM

Very cool!


19th Century Jacobi Math Tactic Gets a Makeover—and Yields Answers Up to 200 Times Faster


A relic from long before the age of supercomputers, the 169-year-old math strategy called the Jacobi iterative method is widely dismissed today as too slow to be useful. But thanks to a curious, numbers-savvy Johns Hopkins engineering student and his professor, it may soon get a new lease on life.

With just a few modern-day tweaks, the researchers say they've made the rarely used Jacobi method work up to 200 times faster. The result, they say, could speed up the performance of computer simulations used in aerospace design, shipbuilding, weather and climate modeling, biomechanics and other engineering tasks.

Their paper describing this updated math tool was published June 27 in the online edition of the Journal of Computational Physics. 

"For people who want to use the Jacobi method in computational mechanics, a problem that used to take 200 days to solve may now take only one day," said Rajat Mittal, a mechanical engineering professor in the university's Whiting School of Engineering and senior author of the journal article. "Our paper provides the recipe for how to speed up this method significantly by just changing four or five lines in the computer code."


Simulation data showing significantly faster reduction in solution error for the new Scheduled Relaxation Jacobi (SRJ) method as compared to the classical Jacobi and Gauss-Seidel iterative methods. The equation being solved here is the two-dimensional Laplace equation on a 128×128 grid.
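
To make the approach concrete, here is a minimal NumPy sketch of the classical Jacobi iteration for that test problem, the 2D Laplace equation on a 128×128 grid. The relaxation-factor cycle is only a placeholder showing where an SRJ-style schedule would plug in; the published SRJ coefficients are not reproduced here.

```python
import numpy as np

def jacobi_laplace(n=128, tol=1e-6, max_iter=200_000, omegas=(1.0,)):
    """Jacobi sweeps for Laplace's equation with fixed (Dirichlet) boundaries.

    `omegas` is a cycle of relaxation factors: (1.0,) is classical Jacobi, while a
    longer cycle mimics the idea of an SRJ schedule (values would be illustrative).
    """
    u = np.zeros((n, n))
    u[0, :] = 1.0                      # example boundary condition: "hot" top edge
    for it in range(max_iter):
        omega = omegas[it % len(omegas)]
        u_new = u.copy()
        # one Jacobi sweep: each interior point becomes the average of its 4 neighbours
        u_new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                    u[1:-1, :-2] + u[1:-1, 2:])
        residual = np.max(np.abs(u_new - u))
        u = u + omega * (u_new - u)    # relaxed update (omega = 1 gives plain Jacobi)
        if residual < tol:
            return u, it + 1
    return u, max_iter

u, sweeps = jacobi_laplace()
print(f"stopped after {sweeps} Jacobi sweeps")
```

With omegas=(1.0,) this is plain Jacobi; the speed-up reported in the paper comes from replacing that single factor with a specific repeating schedule of over- and under-relaxation values, which is the "four or five lines" change Mittal describes.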


FOXO1: Single gene switch to convert human gastrointestinal cells to insulin-producing cells


By switching off a single gene, scientists have converted human gastrointestinal cells into insulin-producing cells, demonstrating in principle that a drug could retrain cells inside a person’s GI tract to produce insulin. The finding raises the possibility that cells lost in type 1 diabetes may be more easily replaced through the reeducation of existing cells than through the transplantation of new cells created from embryonic or adult stem cells. The new research was reported in the online issue of the journal Nature Communications.


"People have been talking about turning one cell into another for a long time, but until now we hadn't gotten to the point of creating a fully functional insulin-producing cell by the manipulation of a single target," said the study's senior author, Domenico Accili, MD, the Russell Berrie Foundation Professor of Diabetes (in Medicine) at Columbia University Medical Center (CUMC).




For nearly two decades, researchers have been trying to make surrogate insulin-producing cells for type 1 diabetes patients. In type 1 diabetes, the body's natural insulin-producing cells are destroyed by the immune system.


Although insulin-producing cells can now be made in the lab from stem cells, these cells do not yet have all the functions of naturally occurring pancreatic beta cells.


This has led some researchers to try instead to transform existing cells in a patient into insulin-producers. Previous work by Dr. Accili's lab had shown that mouse intestinal cells can be transformed into insulin-producing cells; the current Columbia study shows that this technique also works in human cells.


The Columbia researchers were able to teach human gut cells to make insulin in response to physiological circumstances by deactivating the cells' FOXO1 gene. Accili and postdoctoral fellow Ryotaro Bouchi first created a tissue model of the human intestine with human pluripotent stem cells. Through genetic engineering, they then deactivated any functioning FOXO1 inside the intestinal cells. After seven days, some of the cells started releasing insulin and, equally important, only in response to glucose.


The team had used a comparable approach in its earlier mouse study. In the mice, insulin made by gut cells was released into the bloodstream, worked like normal insulin, and was able to nearly normalize blood glucose levels in otherwise diabetic mice (see "New Approach to Treating Type I Diabetes? Columbia Scientists Transform Gut Cells into Insulin Factories"). That work, which was reported in 2012 in the journal Nature Genetics, has since received independent confirmation from another group.

Peter Phillips's curator insight, July 2, 2014 6:43 PM

New hope for diabetics - without a transplant.

Eric Chan Wei Chiang's curator insight, July 13, 2014 10:08 AM

These findings indicate that gastrointestinal cells and insulin producing β cells in the pancreas probably differentiated from the same line of cells during development. Insulin production in gastrointestinal cells is probably deactivated by the FOXO1 gene.

 

This opens up new possibilities as there is already a proof of concept for treating HIV with induced pluripotent stem cells. http://sco.lt/7yg3g9


Reinterpretation of Cold Dark Matter in the Universe as a Bose-Einstein Condensate


Newly published research signifies the reinterpretation of cold dark matter, opening up the possibility that it could be regarded as a very cold quantum fluid governing the formation of the structure of the Universe.


Tom Broadhurst, an Ikerbasque researcher at the UPV/EHU’s Department of Theoretical Physics, has participated alongside scientists of the National Taiwan University in a piece of research that explores cold dark matter in depth and proposes new answers about the formation of galaxies and the structure of the Universe. These predictions, published in the prestigious journal Nature Physics, are being contrasted with fresh data provided by the Hubble space telescope.


In cosmology, cold dark matter is a form of matter the particles of which move slowly in comparison with light, and interact weakly with electromagnetic radiation. It is estimated that only a minute fraction of the matter in the Universe is baryonic matter, which forms stars, planets and living organisms. The rest, comprising over 80%, is dark matter and energy.


The theory of cold dark matter helps to explain how the universe evolved from its initial state to the current distribution of galaxies and clusters, the structure of the Universe on a large scale. In any case, the theory was unable to satisfactorily explain certain observations, but the new research by Broadhurst and his colleagues sheds new light in this respect.


As the Ikerbasque researcher explained, “guided by the initial simulations of the formation of galaxies in this context, we have reinterpreted cold dark matter as a Bose-Einstein condensate”. So, “the ultra-light bosons forming the condensate share the same quantum wave function, so disturbance patterns are formed on astronomic scales in the form of large-scale waves”.
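
A back-of-the-envelope calculation shows why such wave patterns reach astronomical size. The boson mass (about 10^-22 eV/c²) and velocity (about 100 km/s, typical of dwarf-galaxy halos) used below are values commonly assumed in these condensate models, not numbers taken from this summary:

```python
# de Broglie wavelength of an ultra-light boson; all inputs are assumed values.
PLANCK_H = 6.626e-34           # Planck constant, J*s
EV_PER_C2_IN_KG = 1.783e-36    # kilograms per eV/c^2
PARSEC_IN_M = 3.086e16         # metres per parsec

boson_mass_kg = 1e-22 * EV_PER_C2_IN_KG   # assumed mass ~1e-22 eV/c^2
velocity_m_s = 1e5                        # assumed velocity ~100 km/s

wavelength_m = PLANCK_H / (boson_mass_kg * velocity_m_s)
print(f"de Broglie wavelength ~ {wavelength_m / (1e3 * PARSEC_IN_M):.1f} kpc")
# ~1 kpc: the same scale as the cores of dwarf galaxies, which is why the
# condensate's interference patterns show up at galactic scales.
```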


This theory can be used to suggest that all the galaxies in this context should have at their center large stationary waves of dark matter called solitons, which would explain the puzzling cores observed in common dwarf galaxies.


The research also makes it possible to predict that galaxies are formed relatively late in this context in comparison with the interpretation of standard particles of cold dark matter. The team is comparing these new predictions with observations by the Hubble space telescope.


The results are very promising as they open up the possibility that dark matter could be regarded as a very cold quantum fluid that governs the formation of the structure across the whole Universe. This research opened up fresh possibilities to conduct research into the first galaxies to emerge after the Big Bang.


Scientists have made light appear to break Newton’s third law


Laser pulses have been made to accelerate themselves around loops of optical fibre, which seems to go against Newton's third law. This states that for every action there is an equal and opposite reaction. The new research exploits a loophole with light that makes it appear to have mass.


Under Newton’s third law of motion, if we imagine one billiard ball striking another upon a pool table, the two balls will bounce away from each other. If one of the billiard balls had a negative mass, then the collision of the two balls would result in them accelerating in the same direction. This effect could be used in a diametric drive, where negative and positive mass interact for a continuously propulsive effect. Such a drive also relies on the assumption that negative mass has negative inertia. 


Quantum mechanics however states that matter cannot have a negative mass. Negative mass is not the same as antimatter, as even antimatter has positive mass. Negative mass is a hypothetical concept of matter where mass is of opposite sign to the mass of normal matter. Negative mass is used in speculative theories, such as the construction of wormholes. Should such matter exist, it would violate one or more energy conditions and show strange properties. No material object has ever been found that can be shown by experiment to have a negative mass.


Experimental physicist Ulf Peschel and his colleagues at the University of Erlangen-Nuremberg in Germany have now made a diametric drive using effective mass. Photons travelling at the speed of light have no rest mass. When pulses of light are shone into layered materials such as crystals, some of the photons can be reflected backwards by one layer and forwards by another. This delays part of the pulse, which interferes with the rest of the pulse as it passes more slowly through the material.


When a material such as layered crystals slows the speed of the light pulse in proportion to its energy, it is behaving as if it has mass. This is called effective mass, which is the mass that a particle appears to have when responding to forces. Light pulses can have a negative effective mass depending on the shape of their light waves and the structure of the crystal material that the light waves are passing through. To get a pulse to interact with material with a positive effective mass means finding a crystal that is so long that it can absorb the light before different pulses show a diametric drive effect.
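
The standard way to make "effective mass" precise is through the curvature of the dispersion relation: m_eff is proportional to 1/(d²ω/dk²), so a band that curves downward gives a negative effective mass. The cosine band below is a generic textbook model for waves in a periodic structure, used for illustration rather than the specific dispersion engineered in the fibre loops:

```python
# Sign of the effective mass for a model band w(k) = -2*C*cos(k); illustrative only.
import numpy as np

def effective_mass_sign(k, coupling=1.0):
    """m_eff ~ 1/(d^2w/dk^2); for w(k) = -2*C*cos(k) the curvature is 2*C*cos(k)."""
    curvature = 2.0 * coupling * np.cos(k)
    return int(np.sign(curvature))

for k in (0.0, 2.0, np.pi):      # bottom, upper part, and top of the band
    print(f"k = {k:.2f}: effective mass sign = {effective_mass_sign(k):+d}")
```

A pulse prepared near the top of such a band responds to forces as if it had negative mass, which is the kind of behaviour described above.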


Peschel therefore created a series of laser pulses in two loops of fibre-optic cable to get around these requirements. The pulses were split between the loops at a contact point, and the light kept moving around each loop in the same direction.

Infospectives's curator insight, July 8, 2014 5:45 PM

Anyone in the market for a man-made wormhole?


Tweet This: FDA Finally Proposes Social Media Guidelines


After several years of anticipation, the FDA has finally proposed a pair of guidelines for how drug and device makers should cope with some of the challenges and pitfalls posed by social media.


One of the so-called draft guidances offers instructions on how companies should attempt to correct product information on websites that are run by others, such as chat rooms. The other addresses how products – including risk and benefit information – can be discussed in venues such as Twitter, as well as paid search links on Google and Yahoo, all of which have limited space. This will involve using clickable links to product web sites, for instance.


“These are intended to have a beneficial impact on public health,” Tom Abrams, who heads the FDA Office of Prescription Drug Promotion, tells us. “But these were not developed in a vacuum. They were developed with careful consideration and with input from industry and many other stakeholders. There was a lot of important consideration given to the issues.”


For third-party websites, such as Wikipedia, the draft guidance suggests that companies should feel free to correct misinformation, but that any correction must include balanced information and the source of the revision or update must be noted, Abrams explains. This means a company or company employee or contractor should be credited with any additions.


“The information should not be promotional and should be factually correct. This is not an opportunity for a company to tout its drugs,” he says. “The information [being added or revised] should be consistent with the FDA-approved [product] labeling and for it to be effective, you want it posted right by the misinformation.”


The guidance also says that companies should contact writers, such as bloggers, to make changes when they learn of misinformation. Abrams notes companies will not be held responsible for those who do not make changes. If none of this is possible, he says companies should contact web site operators and suggest they delete the misinformation or open the site to comments so that corrections can be made.


The guidelines are being released nearly five years after the FDA held a well-attended public hearing to sift through Internet issues confronting drug and device makers. But the guidelines never materialized, despite repeated signals they may be forthcoming. Now, FDA officials must act before a July deadline set by a 2012 law requiring them to release guidance on product promotion on the Internet.


New imaging method allows to see how the small intestine operates in real time


“Nanojuice” could improve how doctors examine the small intestine.


Located deep in the human gut, the small intestine is not easy to examine. X-rays, MRIs and ultrasound images provide snapshots but each suffers from its own limitations.


University at Buffalo researchers are developing a new imaging technique involving nanoparticles suspended in liquid to form “nanojuice” that patients would drink. Upon reaching the small intestine, doctors would strike the nanoparticles with a harmless laser light, providing an unparalleled, non-invasive, real-time view of the organ.


Described July 6 in the journal Nature Nanotechnology, the advancement could help doctors better identify, understand and treat gastrointestinal ailments.


“Conventional imaging methods show the organ and blockages, but this method allows you to see how the small intestine operates in real time,” said corresponding author Jonathan Lovell, PhD, UB assistant professor of biomedical engineering. “Better imaging will improve our understanding of these diseases and allow doctors to more effectively care for people suffering from them.”


In laboratory experiments performed with mice, the researchers administered the nanojuice orally. They then used photoacoustic tomography (PAT), in which pulsed laser light generates pressure waves that, when measured, provide a real-time and more nuanced view of the small intestine.


The researchers plan to continue to refine the technique for human trials, and move into other areas of the gastrointestinal tract.


DR5 Protein Helps Cells To Adapt—or Die

Scientists show how cell stress both prevents and promotes cell suicide in a study that’s equally divisive.


A cellular stress pathway called the unfolded protein response (UPR) both activates and degrades the death receptor 5 protein (DR5), which can promote or prevent cell suicide, according to a paper published in Science today (July 3). The theory is that initial stress blocks cell suicide, or apoptosis, to give the cell a chance to adapt, but that if the stress persists, it eventually triggers apoptosis.


“This work has made the most beautiful simplification of all this big complex mess. Basically, they identified and pinpointed the specific protein involved in the switching decision and explain how the decision is made,” said Alexei Korennykh, a professor of molecular biology at Princeton University, who was not involved in the work.


But Randal Kaufman of the Sanford-Burnham Medical Research Institute in La Jolla, California, was not impressed. He questioned the physiological relevance of the experiments supporting the authors’ main conclusions about this key cellular process.


Protein folding in a cell takes place largely in the endoplasmic reticulum (ER), but if the process goes awry, unfolded proteins accumulate, stressing the ER. This triggers the UPR, which shuts down translation, degrades unfolded proteins, and increases production of protein-folding machinery. If ER stress is not resolved, however, the UPR can also induce apoptosis.


Two main factors control the UPR—IRE1a and PERK. IRE1a promotes cell survival by activating the transcription factor XBP1, which drives expression of cell-survival genes. PERK, on the other hand, activates a transcription factor called CHOP, which in turn drives expression of the proapoptotic factor DR5.


Peter Walter of the University of California, San Francisco, and his colleagues have now confirmed that CHOP activates DR5, showing that it is a cell-autonomous process. But they have also found that IRE1a suppresses DR5, directly degrading its mRNA through a process called regulated IRE1a-dependent degradation (RIDD). Inhibition of IRE1a in a human cancer cell line undergoing ER stress both prevented DR5 mRNA decay and increased apoptosis.


Circulating Avian Influenza Viruses Closely Related to the 1918 Virus Have Pandemic Potential


Animal influenzas, including bird (avian) flu, are thought to be the reservoir for deadly human strains like the one that caused the 1918 pandemic. Scientists have noted recently that the genes of currently circulating avian flu viruses are very similar to those of the 1918 strain. Professor Yoshihiro Kawaoka of the University of Wisconsin-Madison followed this hunch by generating a strain called "1918-like", combining genes from 8 currently circulating avian flu virus strains.


Alarmingly, he found in testing that it has high potential to infect and cause transmission in humans. Further, a mere seven genetic changes are sufficient to generate a strain that is airborne-transmissible.


To pinpoint the genes that contribute most to the enhanced infectivity of the "1918-like" strain relative to normal avian strains, the researchers went gene by gene. They systematically created strains that had only one gene from the 1918 strain against the genetic background of an otherwise typical avian influenza strain, and were able to show that the genes for hemagglutinin (HA) and an RNA polymerase (PB2) are the strongest contributors to the pathogenicity of the human-generated "1918-like" strain. The HA gene is what the flu virus uses to latch onto the exterior of a human cell. The PB2 gene is what the flu uses to manufacture copies of itself. Both were found to be more efficient in the 1918 and 1918-like strains.


At first the researchers found that the "1918-like" strain was not transmissible. But with these two important genes in hand, they found that adding them from the original 1918 strain gave rise to transmission. This led them to try generating slight variations of "1918-like" to see how easily a transmissible strain could be made.


The researchers again focused on the HA and PB2 genes, making a few mutations. Remarkably, one of the resulting strains, a "1918-like" virus with a mere 7 genetic changes across 3 genes, could infect and be transmitted between their test subjects.


One bright spot in the research is that blood from people who were vaccinated against the more normal 2009 seasonal flu strain also reacts to the dangerous 1918-like strains generated in Kawaoka's laboratory, giving rise to hope that perhaps the population already has some protection. Influenza research of this type requires a high level of safety procedures and precautions. The work carried out by Kawaoka required what is called Biosafety Level 3. This entails the use of negative-pressure hoods, proper safety attire, and restricted access during experimentation. The highest level is Biosafety Level 4, which is reserved for Ebola and other fast-acting, deadly public-health disease agents.


CHD8: Genetic basis for a distinct type of autism uncovered


A variation in the CHD8 gene has a strong likelihood of leading to a type of autism accompanied by digestive problems, a larger head and wide-set eyes. 


“We finally got a clear-cut case of an autism-specific gene,” said Raphael Bernier, University of Washington associate professor of psychiatry and behavioral sciences and clinical director of the Autism Center at Seattle Children’s. He is one of the lead authors of a Cell paper published today, “Disruptive CHD8 Mutations Define a Subtype of Autism in Early Development.” 


Scientists at 13 institutions around the world collaborated on the project.


Autism may have many genetic and other causes, and can vary in how it affects individuals. Currently autism is diagnosed based on behavioral traits.


Today’s discovery is part of an emerging approach to studying the underlying mechanisms of autism and what those mean for people with the condition. Many research teams are trying to group subtypes of autism based on genetic profiles.


The approach could uncover hundreds more genetic mutations. Genetic testing for the various forms eventually could be offered to families to guide them on what to expect and how to care for their child.  


In their study of 6,176 children with autism spectrum disorder, researchers found 15 had a CHD8 mutation. All the cases had similarities in their physical appearance as well as sleep disturbances and gastrointestinal problems. Bernier and his team interviewed all 15 of the children.


To confirm the findings, the researchers worked with scientists at Duke University who study genetically modified zebrafish, a common laboratory model for gene mutation studies. After the researchers disrupted the fish’s CHD8 gene, the small fry developed large heads and wide set eyes. They then fed the naturally semi-transparent fish fluorescent pellets to observe their digestion. They found that the fish had problems discarding food waste and were constipated.


Bernier said this is the first time researchers have shown a definitive cause of autism from a genetic mutation. Previously identified genetic conditions such as Fragile X, which accounts for a greater number of autism cases, are associated more with other neurological impairments, such as intellectual disability, than with autism. Although less than half a percent of all people with autism will have this subtype, Bernier said the study has many implications.

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

New to Google Earth: Ancient Flying Reptiles Database and Mapping Tool

New to Google Earth: Ancient Flying Reptiles Database and Mapping Tool | Amazing Science | Scoop.it

A newly developed website catalogs more than 1,300 specimens of extinct flying reptiles called pterosaurs, thus enabling users to map out the ancient creatures on Google Earth. The goal is to help researchers find trends in the evolution and diversity of these ancient winged reptiles.


"Having a very specific database like this, which is just for looking at individual fossil specimens of pterosaurs, is very helpful, because you can ask questions that you couldn't have answered with bigger databases [of more animals]," said Matthew McLain, a doctoral candidate in paleontology at Loma Linda University in California and one of the three developers of the site. McLain and his colleagues call their database PteroTerra


Pterosaurs were the first flying vertebrates. They lived between 228 million and 66 million years ago, and went extinct around the end of the Cretaceous period. During that time, the group evolved to be incredibly diverse. Some were tiny, like the sparrow-size Nemicolopterus crypticus, which lived 120 million years ago in what is now China. Others were simply huge, like Quetzalcoatlus, which was as tall as a giraffe and probably went around spearing little dinosaurs with its beak, much as a stork snacks on frogs.

Paleontological databases are common tools, because they allow researchers to navigate through descriptions of fossil specimens. One of the largest, the Paleobiology Database, has more than 50,000 individual entries.


McLain and his colleagues wanted something more targeted. They painstakingly built PteroTerra from the ground up. McLain, as the paleontologist on the project, read published papers on pterosaurs and visited museums to catalog specimens.


"I think we have every species represented, so in that sense, it's pretty complete," he told Live Science. The database does not contain every specimen of pterosaur material ever found — tens of thousands of fossil fragments have been discovered — but McLain hopes to get other paleontologists on board as administrators to upload their specimen data.

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Ninety-nine percent of the ocean's plastic is missing

Ninety-nine percent of the ocean's plastic is missing | Amazing Science | Scoop.it

Millions of tons. That’s how much plastic should be floating in the world’s oceans, given our ubiquitous use of the stuff. But a new study finds that 99% of this plastic is missing. One disturbing possibility: Fish are eating it.


If that’s the case, “there is potential for this plastic to enter the global ocean food web,” says Carlos Duarte, an oceanographer at the University of Western Australia, Crawley. “And we are part of this food web.”


Humans produce almost 300 million tons of plastic each year. Most of this ends up in landfills or waste pits, but a 1970s National Academy of Sciences study estimated that 0.1% of all plastic washes into the oceans from land, carried by rivers, floods, or storms, or dumped by maritime vessels. Some of this material becomes trapped in Arctic ice and some, landing on beaches, can even turn into rocks made of plastic. But the vast majority should still be floating out there in the sea, trapped in mid-ocean gyres—large eddies in the center of oceans, like the Great Pacific Garbage Patch.


To figure out how much refuse is floating in those garbage patches, four ships of the Malaspina expedition, a global research project studying the oceans, fished for plastic across all five major ocean gyres in 2010 and 2011. After months of trailing fine mesh nets around the world, the vessels came up light—by a lot. Instead of the millions of tons scientists had expected, the researchers calculated the global load of floating ocean plastic to be at most about 40,000 tons, they report online today in the Proceedings of the National Academy of Sciences. “We can’t account for 99% of the plastic that we have in the ocean,” says Duarte, the team’s leader.
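
A rough back-of-envelope calculation, using only the figures quoted in the article plus an assumed number of years of accumulation, shows how a “99% missing” figure can arise:

```python
# Back-of-envelope using the figures quoted above; the assumed accumulation
# period is an illustrative guess, not a number from the paper.
annual_production_tons = 300e6   # ~300 million tons of plastic produced per year
fraction_to_ocean = 0.001        # 1970s NAS estimate: 0.1% reaches the sea
years_accumulating = 40          # assumed decades of accumulation (illustrative)

expected_input = annual_production_tons * fraction_to_ocean * years_accumulating
measured_afloat = 40_000         # Malaspina expedition upper estimate, in tons

print(f"Expected input : ~{expected_input / 1e6:.0f} million tons")
print(f"Measured afloat: ~{measured_afloat:,} tons")
print(f"Unaccounted for: ~{1 - measured_afloat / expected_input:.1%}")
```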


He suspects that a lot of the missing plastic has been eaten by marine animals. When plastic is floating out on the open ocean, waves and radiation from the sun can fragment it into smaller and smaller particles, until it gets so small that it begins to look like fish food—especially to lanternfish, a widespread small marine fish known to ingest plastic.

“Yes, animals are eating it,” says oceanographer Peter Davison of the Farallon Institute for Advanced Ecosystem Research in Petaluma, California, who was not involved in the study. “That much is indisputable.”


But, he says, it’s hard to know at this time what the biological consequences are. Toxic ocean pollutants like DDT, PCBs, or mercury cling to the surface of plastics, causing them to “suck up all the pollutants in the water and concentrate them.” When animals eat the plastic, that poison could be going into the fish and traveling up the food chain to market species like tuna or swordfish. Or, Davison says, toxins in the fish “may dissolve back into the water … or for all we know they’re puking [the plastic] or pooping it out, and there’s no long-term damage. We just don’t know.”

more...
Eric Chan Wei Chiang's comment, July 8, 2014 3:55 AM
Much of the missing plastic is converted into microplastics and some of it is consumed by wildlife http://sco.lt/70s3kn
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Overcoming light scattering: Single-pixel optical system uses compressive sensing to see deeper inside tissue

Overcoming light scattering: Single-pixel optical system uses compressive sensing to see deeper inside tissue | Amazing Science | Scoop.it

Optical imaging methods are rapidly becoming essential tools in biomedical science because they're noninvasive, fast, cost-efficient and pose no health risks since they don't use ionizing radiation. These methods could become even more valuable if researchers could find a way for optical light to penetrate all the way through the body's tissues. With today's technology, even passing through a fraction of an inch of skin is enough to scatter the light and scramble the image.

Now a team of researchers from Spain's Jaume I University (UJI) and the University of València has developed a single-pixel optical system based on compressive sensing that can overcome the fundamental limitations imposed by this scattering. The work was published today in The Optical Society's (OSA) open-access journal Optics Express.


"In the diagnostic realm within the past few years, we've witnessed the way optical imaging has helped clinicians detect and evaluate suspicious lesions," said Jesús Lancis, the paper's co-author and a researcher in the Photonics Research Group at UJI. "The elephant in the room, however, is the question of the short penetration depth of light within tissue compared to ultrasound or x-ray technologies. Current knowledge is insufficient for early detection of small lesions located deeper than a millimeter beneath the surface of the mucosa." "Our goal is to see deeper inside tissue," he added.


To achieve this, the team used an off-the-shelf digital micromirror array from a commercial video projector to create a set of microstructured light patterns that are sequentially superimposed onto a sample. They then measured the transmitted energy with a photodetector that senses how much light arrives but has no spatial resolution. Finally, they applied a signal-processing technique called compressive sensing, which compresses data as it is measured, to reconstruct the image from far fewer readings than there are pixels.
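
The single-pixel measurement model can be summarized in a short simulation: each projected pattern yields one detector reading, and a sparse scene is then recovered from far fewer readings than pixels. The sketch below uses a generic iterative soft-thresholding (ISTA) solver as the compressive-sensing reconstruction; it illustrates the general principle and is not the UJI/València team’s actual algorithm or parameters.

```python
import numpy as np

# Toy single-pixel imaging simulation: the detector records one number per
# projected pattern (y = A @ x), and a sparse scene is recovered from fewer
# measurements than pixels via ISTA (iterative soft-thresholding).

rng = np.random.default_rng(0)

n_pixels = 256                      # unknown scene, flattened to a vector
n_measurements = 80                 # far fewer detector readings than pixels

# Ground-truth "scene": sparse, e.g. a few bright spots
x_true = np.zeros(n_pixels)
x_true[rng.choice(n_pixels, size=8, replace=False)] = rng.uniform(1, 2, 8)

# Micromirror patterns: one row per projected pattern (random +/-1 here)
A = rng.choice([-1.0, 1.0], size=(n_measurements, n_pixels))

# Single-pixel detector readings: total transmitted energy per pattern
y = A @ x_true

def ista(A, y, lam=0.1, n_iter=500):
    """Minimize 0.5*||A x - y||^2 + lam*||x||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x

x_hat = ista(A, y)
print("relative reconstruction error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```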


One of the most surprising aspects of the team's work is that they use essentially a single-pixel sensor to capture the images. While most people think that more pixels result in better image quality, there are some cases where this isn't true, Lancis said. In low-light imaging, for instance, it's better to integrate all available light into a single sensor. If the light is split into millions of pixels, each sensor receives a tiny fraction of light, creating noise and destroying the image.

more...
Donald Schwartz's curator insight, July 2, 2014 7:33 PM

At least a step in the other direction.

Scooped by Dr. Stefan Gruenwald
Scoop.it!

New Species of Beetle Discovered in World's Deepest Cave

New Species of Beetle Discovered in World's Deepest Cave | Amazing Science | Scoop.it

We've been to the moon, but we still haven't discovered everything on our own planet. An expedition to the world’s deepest cave, Krubera-Voronja in the Western Caucasus, revealed an interesting subterranean community living at depths below 2,000 meters and represented by more than 12 species of arthropods, including several species new to science. This deep cave biota is composed of troglobionts as well as epigean species that can penetrate down to −2,140 m. The distance from the base of the Krubera-Voronja system to the top is about the same as the height of seven Eiffel Towers. Ambient temperatures are constantly below seven degrees Celsius, and it gets considerably colder the lower you descend. Water temperature is just above freezing.


The study documents the biocoenosis and the vertical distribution of the invertebrate fauna of Krubera-Voronja, from its entrance down to the remarkable depth of 2,140 meters, including the discovery of the world’s deepest-dwelling arthropod.


A new species of ground beetle, named Duvalius abyssimus, was recently discovered by scientists exploring the subterranean fauna living up to 1.5 miles below the earth's surface in Krubera-Voronja. The new creature has adapted to life without light in the world's deepest cave system, with extended antennae and a body that lacks pigment. It is about a quarter of an inch long.

more...
No comment yet.
Scooped by Dr. Stefan Gruenwald
Scoop.it!

Dramatic decline of Caribbean coral reefs: Most corals may disappear within the next 20 years

Dramatic decline of Caribbean coral reefs: Most corals may disappear within the next 20 years | Amazing Science | Scoop.it

With only about one-sixth of the original coral cover left, most Caribbean coral reefs may disappear in the next 20 years, primarily due to the loss of grazers in the region, according to the latest report by the Global Coral Reef Monitoring Network (GCRMN), the International Union for Conservation of Nature (IUCN) and the United Nations Environment Programme (UNEP).


The report, Status and Trends of Caribbean Coral Reefs: 1970-2012, is the most detailed and comprehensive study of its kind published to date – the result of the work of 90 experts over the course of three years. It contains the analysis of more than 35,000 surveys conducted at 90 Caribbean locations since 1970, including studies of corals, seaweeds, grazing sea urchins and fish.


The results show that the Caribbean corals have declined by more than 50% since the 1970s. But according to the authors, restoring parrotfish populations and improving other management strategies, such as protection from overfishing and excessive coastal pollution, could help the reefs recover and make them more resilient to future climate change impacts.


“The rate at which the Caribbean corals have been declining is truly alarming,” says Carl Gustaf Lundin, Director of IUCN’s Global Marine and Polar Programme. “But this study brings some very encouraging news: the fate of Caribbean corals is not beyond our control and there are some very concrete steps that we can take to help them recover.”


Climate change has long been thought to be the main culprit in coral degradation. While it does pose a serious threat by making oceans more acidic and causing coral bleaching, the report shows that the loss of parrotfish and sea urchins – the area’s two main grazers – has, in fact, been the key driver of coral decline in the region. An unidentified disease led to a mass mortality of sea urchins in 1983, and extreme fishing throughout the 20th century has brought the parrotfish population to the brink of extinction in some regions. The loss of these species breaks the delicate balance of coral ecosystems and allows the algae on which they feed to smother the reefs.

more...
Peter Phillips's curator insight, July 2, 2014 6:27 PM

Scientists have identified the loss of grazers (parrotfish and sea urchins) as the main reason behind the decline in reef health in the Caribbean. The disruption to the reef ecosystem is now understood to be more important than climate change and ocean acidification to the resilience of coral reefs. Overfishing and a disease which affected sea urchins led to algal growth which smothers coral.