Amazing Science
Scooped by Dr. Stefan Gruenwald onto Amazing Science

A Strange Computer Promises Great Speed

Lockheed Martin will make commercial use of quantum computing, which could solve some business and science problems millions of times faster than can be done today.

 

Our digital age is all about bits, those precise ones and zeros that are the stuff of modern computer code. But a powerful new type of computer that is about to be commercially deployed by a major American military contractor is taking computing into the strange, subatomic realm of quantum mechanics. In that infinitesimal neighborhood, common-sense logic no longer seems to apply. A one can be a one, or it can be a one and a zero and everything in between — all at the same time.

 

It sounds preposterous, particularly to those familiar with the yes/no world of conventional computing. But academic researchers and scientists at companies like Microsoft, I.B.M. and Hewlett-Packard have been working to develop quantum computers. Now, Lockheed Martin — which bought an early version of such a computer from the Canadian company D-Wave Systems two years ago — is confident enough in the technology to upgrade it to commercial scale, becoming the first company to use quantum computing as part of its business. Skeptics say that D-Wave has yet to prove to outside scientists that it has solved the myriad challenges involved in quantum computation.


But if it performs as Lockheed and D-Wave expect, the design could be used to supercharge even the most powerful systems, solving some science and business problems millions of times faster than can be done today.

 

Ray Johnson, Lockheed’s chief technical officer, said his company would use the quantum computer to create and test complex radar, space and aircraft systems. It could be possible, for example, to tell instantly how the millions of lines of software running a network of satellites would react to a solar burst or a pulse from a nuclear explosion — something that can now take weeks, if ever, to determine.

“This is a revolution not unlike the early days of computing,” he said. “It is a transformation in the way computers are thought about.”

Many others could find applications for D-Wave’s computers. Cancer researchers see a potential to move rapidly through vast amounts of genetic data. The technology could also be used to determine the behavior of the proteins encoded by the human genome, a bigger and tougher problem than sequencing the genome. Researchers at Google have worked with D-Wave on using quantum computers to recognize cars and landmarks, a critical step in managing self-driving vehicles.


Quantum computing is so much faster than traditional computing because of the unusual properties of particles at the smallest level. Instead of the precision of ones and zeros that have been used to represent data since the earliest days of computers, quantum computing relies on the fact that subatomic particles inhabit a range of states. Different relationships among the particles may coexist, as well. Those probable states can be narrowed to determine an optimal outcome among a near-infinitude of possibilities, which allows certain types of problems to be solved rapidly.
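The kind of problem described above can be made concrete. D-Wave's machines target QUBO (quadratic unconstrained binary optimization) problems: finding the bit assignment that minimizes an energy function. The sketch below is purely classical and illustrative — the tiny matrix is an invented toy example, and a conventional computer must search all 2^n assignments, which is exactly the cost the quantum hardware is meant to short-circuit:

```python
# Toy illustration (not quantum): D-Wave-style annealers minimize an
# energy E(x) = sum_ij Q[i][j] * x[i] * x[j] over bit vectors x.
# Classically this means searching 2^n possibilities by brute force.
from itertools import product

def qubo_energy(Q, x):
    """Energy of bit assignment x under QUBO matrix Q."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def brute_force_qubo(Q):
    """Exhaustively find a minimum-energy bit assignment."""
    n = len(Q)
    return min(product([0, 1], repeat=n), key=lambda x: qubo_energy(Q, x))

# Invented example: each bit alone lowers the energy, but turning
# both on together incurs a penalty, so the optimum sets exactly one.
Q = [[-1, 2],
     [0, -1]]
best = brute_force_qubo(Q)
```

For n qubits the search space doubles with every added bit, which is why even "millions of times faster" claims hinge on sidestepping this enumeration.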

 

D-Wave, a 12-year-old company based in Vancouver, has received investments from Jeff Bezos, the founder of Amazon.com, which operates one of the world’s largest computer systems, as well as from the investment bank Goldman Sachs and from In-Q-Tel, an investment firm with close ties to the Central Intelligence Agency and other government agencies.


“What we’re doing is a parallel development to the kind of computing we’ve had for the past 70 years,” said Vern Brownell, D-Wave’s chief executive.

Mr. Brownell, who joined D-Wave in 2009, was until 2000 the chief technical officer at Goldman Sachs. “In those days, we had 50,000 servers just doing simulations” to figure out trading strategies, he said. “I’m sure there is a lot more than that now, but we’ll be able to do that with one machine, for far less money.”



20,000+ FREE Online Science and Technology Lectures from Top Universities


NOTE: To subscribe to the RSS feed of Amazing Science, copy http://www.scoop.it/t/amazing-science/rss.xml into the URL field of your browser and click "subscribe".

 

This newsletter is aggregated from over 1450 news sources:

http://www.genautica.com/links/1450_news_sources.html

 

All my Tweets and Scoop.It! posts sorted and searchable:

http://www.genautica.com/tweets/index.html

 

••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••

You can search through all the articles semantically on my archived twitter feed.

••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••

NOTE: All articles in the amazing-science newsletter can also be sorted by topic. To do so, click the FIND button (symbolized by the FUNNEL at the top right of the screen) to display all the relevant postings SORTED by TOPICS.

 

You can also type your own query:

 

e.g., if you are looking for articles involving "dna" as a keyword:

 

http://www.scoop.it/t/amazing-science/?q=dna
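The same search can be scripted by assembling the query URL. The helper below is hypothetical (not something Scoop.it provides) and simply shows how the `q` parameter above is built:

```python
# Illustrative helper for building the keyword-search URL shown above.
from urllib.parse import urlencode

def amazing_science_query(keyword):
    """Return the Amazing Science search URL for a keyword, e.g. 'dna'."""
    base = "http://www.scoop.it/t/amazing-science/"
    return base + "?" + urlencode({"q": keyword})

url = amazing_science_query("dna")
```

Using `urlencode` keeps multi-word or special-character keywords safely escaped.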


Or click on the little FUNNEL symbol at the top right of the screen.

••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••


MOST_READ • 3D_printing • aging • AI • anthropology • art • astronomy • bigdata • bioinformatics • biology • biotech • chemistry • computers • cosmology • education • environment • evolution • future • genetics • genomics • geosciences • green_energy • history • language • map • material_science • math • med • medicine • microscopy • nanotech • neuroscience • paleontology • photography • photonics • physics • postings • robotics • science • technology • video 

Margarida Sá Costa's curator insight, January 31, 9:55 AM

Lectures are in Playlists and are alphabetically sorted with thumbnail pictures. No fee, no registration required - learn at your own pace. Certificates can be arranged with presenting universities.

Casper Pieters's curator insight, March 9, 7:21 PM

Great resources for online learning just about everything. All you need is willpower and self-discipline.

Siegfried Holle's curator insight, July 4, 8:45 AM

Your knowledge is your strength and power 


New solar power material converts 90 percent of captured light into heat


A multidisciplinary engineering team at the University of California, San Diego developed a new nanoparticle-based material for concentrating solar power plants designed to absorb and convert to heat more than 90 percent of the sunlight it captures. The new material can also withstand temperatures greater than 700 degrees Celsius and survive many years outdoors in spite of exposure to air and humidity. Their work, funded by the U.S. Department of Energy's SunShot program, was published recently in two separate articles in the journal Nano Energy.


By contrast, current solar absorber materials function at lower temperatures and need to be overhauled almost every year for high-temperature operations. "We wanted to create a material that absorbs sunlight and doesn't let any of it escape. We want the black hole of sunlight," said Sungho Jin, a professor in the department of Mechanical and Aerospace Engineering at UC San Diego Jacobs School of Engineering. Jin, along with professor Zhaowei Liu of the department of Electrical and Computer Engineering, and Mechanical Engineering professor Renkun Chen, developed the silicon boride-coated nanoshell material. They are all experts in functional materials engineering.


The novel material features a "multiscale" surface created by using particles of many sizes ranging from 10 nanometers to 10 micrometers. The multiscale structures can trap and absorb light which contributes to the material's high efficiency when operated at higher temperatures.


Concentrating solar power (CSP) is an emerging alternative clean energy market that produces approximately 3.5 gigawatts worth of power at power plants around the globe—enough to power more than 2 million homes, with additional construction in progress to provide as much as 20 gigawatts of power in coming years. One of the technology's attractions is that it can be used to retrofit existing power plants that use coal or fossil fuels because it uses the same process to generate electricity from steam.
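A back-of-envelope check of those figures (treating "power more than 2 million homes" as average continuous draw — an assumption, since the article doesn't define it):

```python
# Sanity check of the CSP figures: 3.5 GW across 2 million homes
# implies roughly 1.75 kW of average draw per home.
installed_watts = 3.5e9   # ~3.5 GW of CSP capacity worldwide
homes_powered = 2e6       # "more than 2 million homes"
watts_per_home = installed_watts / homes_powered

# At the same rate, the ~20 GW under construction would cover
# on the order of 11 million additional homes.
planned_watts = 20e9
homes_at_same_rate = planned_watts / watts_per_home
```

About 1.75 kW per home is a plausible average-demand figure, so the article's numbers are internally consistent.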


Google X project plans to use magnetic nanoparticles and wearable sensor to detect diseases


Google announced a new “Nanoparticle Platform” project Tuesday to develop medical diagnostic technology using nanoparticles, Andrew Conrad, head of the Google X Life Sciences team, said at The Wall Street Journal’s WSJD Live conference. The idea is to circulate nanoparticles with magnetic cores in the bloodstream, carrying recognition molecules to detect cancer, plaques, or too much sodium, for example.


There are a number of similar research projects using magnetic (and other) nanoparticles in progress, as reported on KurzweilAI. What’s new in the Google project is delivering nanoparticles to the bloodstream via a pill and using a wearable wrist detector to detect the nanoparticles’ magnetic field and read out diagnostic results.


But this is an ambitious moonshot project. “Google is at least five to seven years away from a product approved for use by doctors,” said Sam Gambhir, chairman of radiology at Stanford University Medical School, who has been advising Dr. Conrad on the project for more than a year, the WSJ reports.


“Even if Google can make the system work, it wouldn’t immediately be clear how to interpret the results. That is why Dr. Conrad’s team started the Baseline study [see “New Google X Project to look for disease and health patterns in collected data”], which he hopes will create a benchmark for comparisons.”


Lights out: study shows urgent need to address instability of world's power grid


Research by Hugh Byrd, Professor of Architecture at the University of Lincoln, UK, and Steve Matthewman, Associate Professor of Sociology at the University of Auckland, New Zealand, highlights the insecurities of power systems and weakening electrical infrastructure across the globe, particularly in built-up urban areas.

The work builds on previous studies which examined a sharp increase in electrical usage over recent years, and warned the world to prepare for the prospect of coping without electricity as instances of complete power failure become increasingly common.

Professor Byrd explained: “We have previously highlighted that demand for new technology continues to grow at an unprecedented rate. Our new research emphasizes why energy sources are becoming increasingly inadequate, and simply cannot continue to meet this demand.

“Throughout our study, we observed a number of network failures due to inadequate energy, whether through depletion of resources such as oil and coal, or through the vagaries of the climate in the creation of renewable energy.”

The British energy regulator Ofgem has predicted a fall in spare electrical power production capacity to two per cent by 2015, meaning there is now even less flexibility of supply to adjust to spikes in demand. 

The issue of energy security exists for countries which have access to significant renewable power supplies too. With rain, wind and sunshine becoming less predictable due to changes brought about by global warming, the new research found that severe blackouts in Kenya, India, Tanzania and Venezuela, which all occurred during the last decade, were caused by shortages of rain in hydro-dams.

Further to the irregularities involved in renewable power generation, the study concludes that worldwide electricity supply will also become increasingly precarious due to industry privatization and neglect of infrastructure.

Professor Matthewman said: “Over the past two decades, deregulation and privatization have become major global trends within the electrical power industry. In a competitive environment, reliability and profits may be at cross-purposes — single corporations can put their own interests ahead of the shared grid, and spare capacity is reduced in the name of cost saving. There is broad consensus among energy specialists, national advisory bodies, the reinsurance industry, and organizational sociologists that this has exacerbated blackout risk.”

These trends have seen the separation of power generation, transmission and distribution services – a process which Professors Byrd and Matthewman suggest only opens up more opportunity for electrical disruption. Their study reveals the difficulties that arise when different technical and human systems need to communicate, and points to a breakdown in this type of communication as the main cause behind two of the world’s worst ever blackouts – from Ohio, USA, to Ontario, Canada, in 2003; and across Italy and neighboring nations in the same year. Together, these power failures affected more than 100 million people.


San Diego company develops 10-minute $10 Ebola test


With a single prick and a single drop of blood, a San Diego company claims it can now detect whether a patient has Ebola in less than 10 minutes. The breakthrough technology is called “Ebola Plus,” a tool that can be used to detect Ebola in anyone, anywhere in the world.


“We can do that for a large number of tests simultaneously with just one drop of blood,” said Dr. Cary Gunn, Ph.D., CEO of Genalyte. Once blood is drawn, a silicon chip is used to detect the virus as blood flows over it.


Researchers at Genalyte have been working on the diagnostic tool for seven years, using it to test for various diseases, and only recently discovered it could also work to spot Ebola. “It allows you to screen more patients more rapidly. The biggest question right now is the debate about quarantine. Instead of asking people to take their fever once or twice a day, they can just take a prick of blood,” said Dr. Gunn.

The test can analyze up to 100 samples per hour, and be administered anywhere, including hospitals, airports, and even remote areas in West Africa where the disease is spreading rapidly. “Right now, most people in Liberia aren’t even being tested. People who have suspicion of having Ebola are being checked into wards. The ability to take a prick of blood and do the test would be a game changer in that environment,” said Gunn.


Developing the platform for the test cost Genalyte around $100,000, but each chip used during the tests costs $10 – making early detection cheaper and easier for caretakers. Currently, the FDA has approved only PCR tests, which can take two hours to return results, compared with the Ebola Plus, which can provide results in ten minutes.


Rapid Evolution of Anole Populations in Real Time


On islands off the coast of Florida, scientists uncover swift adaptive changes among Carolina anole populations, whose habitats were disturbed by the introduction of another lizard species.


For most of its existence, the Carolina anole (Anolis carolinensis) was the only lizard in the southeastern U.S. It could perch where it wanted, eat what it liked. But in the 1970s, aided by the human pet trade, the brown anole (Anolis sagrei)—native to Cuba and the Bahamas—came marching in. In experiments on islands off the coast of Florida, scientists studying the effects of the species mixing witnessed evolution in action: the Carolina anole started perching higher up in trees, and its toe pads changed to enable better grip—all in a matter of 15 years, or about 20 lizard generations.
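A quick sanity check on the timescale quoted above:

```python
# 15 years of divergence over about 20 lizard generations implies a
# generation time of roughly nine months, consistent with a small,
# fast-breeding lizard.
years = 15
generations = 20
years_per_generation = years / generations        # 0.75 years
months_per_generation = years_per_generation * 12
```

Short generation times are what make such rapid, observable adaptation possible in the first place.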


In a paper published in Science today (October 23), Yoel Stuart of the University of Texas at Austin, Todd Campbell from the University of Tampa, Florida, and their colleagues discuss what happened when the two species converged upon the same habitats.


Fact or Fiction?: Mammoths Can Be Brought Back from Extinction


In a petri dish in the bowels of Harvard Medical School scientists have tweaked three genes from the cells of an Asian elephant that help control the production of hemoglobin, the protein in blood that carries oxygen. Their goal is to make these genes more like those of an animal that last walked the planet thousands of years ago: the woolly mammoth.

"Asian elephants are closer to mammoths than either is to African elephants, yet quite different in appearance and temperature range," notes Harvard geneticist and technology developer George Church. "We are not trying to make an exact copy of a mammoth, but rather a cold-resistant elephant."
 
But what if the new—and fast advancing—techniques of genome editing allowed scientists to engineer not only cold-resistance traits but also other characteristics of the woolly mammoth into its living Asiatic relatives? Scientists have found mammoth cells preserved in permafrost. If they were to recover cells with intact DNA, they could theoretically “edit” an Asian elephant’s genome to match the woolly mammoth’s. A single cell contains the complete genetic instruction set for its species, and by replicating that via editing a new individual can, theoretically, be created. But would such a hybrid—scion of an Asian elephant mother and genetic tinkerers—count as a true woolly mammoth?
 
In other words, is de-extinction a real possibility?
 
The answer is yes. On January 6, 2000, a falling tree killed the last bucardo, a wild Iberian ibex, which is a goatlike animal. Her name was Celia. On July 30, 2003, Celia's clone was born. To make the clone, scientists removed the nucleus of a cell from Celia intact and inserted it into the unfertilized egg cell of another kind of ibex. They then transferred the resulting embryo to the womb of a living goat. Nearly a year later they delivered the clone by cutting her from her mother.
 
Although she lived for a scant seven minutes due to lung defects, Celia’s clone proved that not only is de-extinction real, "it has already happened," in the words of environmentalist Stewart Brand, whose San Francisco-based Long Now Foundation is funding some of this de-extinction research, including Church's effort as well as bids to bring back the passenger pigeon and heath hen, among other candidate species. Nor is the bucardo alone in the annals of de-extinction. Several viruses have already been brought back, including the flu variant responsible for the 1918 pandemic that killed more than 20 million people worldwide.


New compounds decrease inflammation associated with ulcerative colitis, arthritis and multiple sclerosis

Six Case Western Reserve scientists are part of an international team that has discovered two compounds that show promise in decreasing inflammation associated with diseases such as ulcerative colitis, arthritis and multiple sclerosis. The compounds, dubbed OD36 and OD38, specifically appear to curtail inflammation-triggering signals from RIPK2 (receptor-interacting protein kinase 2). RIPK2 is an enzyme that activates high-energy molecules to prompt the immune system to respond with inflammation. The findings of this research appear in the Journal of Biological Chemistry.

“This is the first published indication that blocking RIPK2 might be efficacious in inflammatory disease,” said senior author Derek Abbott, MD, PhD, associate professor of pathology, Case Western Reserve University School of Medicine. “Our data provides a strong rationale for further development and optimization of RIPK2-targeted pharmaceuticals and diagnostics.”

In addition to Abbott and his medical school colleagues, the research team included representatives of Oncodesign, a therapeutic molecule biotechnology company in Dijon, France; Janssen Research & Development, a New Jersey-based pharmaceutical company; and Asclepia Outsourcing Solutions, a Belgium-based medicinal chemistry company.

The normal function of RIPK2 is to send warning signals to cells that bacterial infection has occurred, which in turn spurs the body to mobilize white blood cells. The white blood cells identify and encircle pathogens, which cause blood to accumulate in the region. It is this blood build-up that leads to the red and swollen areas characteristic of inflammation. When this process goes awry, the inflammation increases dramatically and tissue destruction ensues. RIPK2 works in conjunction with NOD1 and NOD2 (nucleotide-binding oligomerization domain) proteins in controlling responses by the immune system that lead to this inflammation process.

In this research project, investigators applied state-of-the-art genetic sequencing to learn the unique set of genes driven specifically by NOD2 proteins. They ultimately zeroed in on three specific NOD2-driven inflammation genes (SLC26a, MARCKSL1, and RASGRP1) that guided investigators in finding the most effective compounds.

Oncodesign searched its library of 4,000 compounds that targeted kinases, and after exhaustive study, narrowed the selection down to 13. Then investigators tested the 13 compounds in mouse and human cells and found that two compounds, OD36 and OD38, were most effective in blocking RIPK2. 

“Based on the design of OD36 and OD38, we have developed with Oncodesign fifth-generation compounds that are even more effective than the first-generation OD36 and OD38,” Abbott said. “Our next step is to seek a larger pharmaceutical company that can move these compounds forward into Phase 1 clinical trials in humans.”


Faster switching helps ferroelectrics become viable replacement for transistors


Ferroelectric materials – commonly used in transit cards, gas grill igniters, video game memory and more – could become strong candidates for use in next-generation computers, thanks to new research led by scientists at the University of California, Berkeley, and the University of Pennsylvania.


The researchers found an easy way to improve the performance of ferroelectric materials in a way that makes them viable candidates for low-power computing and electronics. They described their work in a study published today (Sunday, Oct. 26) in the journal Nature Materials.


Ferroelectric materials have spontaneous polarization as a result of small shifts of negative and positive charges within the material. A key characteristic of these materials is that the polarization can be reversed in response to an electric field, enabling the creation of a “0” or “1” data bit for memory applications. Ferroelectrics can also produce an electric charge in response to physical force, such as being pressed, squeezed or stretched, which is why they are found in applications such as push-button igniters on portable gas grills.


“What we discovered was a fundamentally new and unexpected way for these ferroelectric materials to respond to applied electric fields,” said study principal investigator Lane Martin, UC Berkeley associate professor of materials science and engineering. “Our discovery opens up the possibility for faster switching and new control over novel, never-before-expected multi-state devices.”


Martin and other UC Berkeley researchers partnered with a team led by Andrew Rappe, University of Pennsylvania professor of chemistry and of materials science and engineering. UC Berkeley graduate student Ruijuan Xu led the study’s experimental design, and Penn graduate student Shi Liu led the study’s theoretical modeling.


Scientists have turned to ferroelectrics as an alternative form of data storage and memory because the material holds a number of advantages over conventional semiconductors. For example, anyone who has ever lost unsaved computer data after power is unexpectedly interrupted knows that today’s transistors need electricity to maintain their “on” or “off” state in an electronic circuit.


Because ferroelectrics are non-volatile, they can remain in one polarized state or another without power. This ability of ferroelectric materials to store memory without continuous power makes them useful for transit cards, such as the Clipper cards used to pay fare in the Bay Area, and in certain memory cards for consumer electronics. If used in next-generation computers, ferroelectrics would enable the retention of information so that data would be there if electricity goes out and then is restored.
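The volatile/non-volatile contrast described above can be sketched as a toy model — purely conceptual, with invented class names, and in no way real device physics:

```python
# Conceptual toy model: a volatile cell needs power to hold its state,
# while a ferroelectric cell keeps its remanent polarization -- its
# stored 0 or 1 -- when power is removed.
class VolatileCell:
    def __init__(self):
        self.state = 0
    def write(self, bit):
        self.state = bit
    def power_cycle(self):
        self.state = 0          # charge leaks away: data lost

class FerroelectricCell:
    def __init__(self):
        self.polarization = 0   # remanent polarization encodes the bit
    def write(self, bit):
        self.polarization = bit # applied field flips the polarization
    def power_cycle(self):
        pass                    # polarization persists without power
    @property
    def state(self):
        return self.polarization

dram, feram = VolatileCell(), FerroelectricCell()
for cell in (dram, feram):
    cell.write(1)
    cell.power_cycle()          # only the ferroelectric bit survives
```

After the power cycle the volatile cell reads 0 regardless of what was written, while the ferroelectric cell still reads the stored 1 — the property that makes unsaved-data loss avoidable.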


“If we could integrate these materials into the next generation of computers, people wouldn’t lose their data if the power goes off,” said Martin, who is also a faculty scientist at the Lawrence Berkeley National Laboratory. “For an individual, losing unsaved work is an inconvenience, but for large companies like eBay, Google and Amazon, losing data is a significant loss of revenue.”


So what has held ferroelectrics back from wider use as on/off switches in integrated circuits? The answer is speed, according to the study authors.


Scientists discover principle how microbes build powerful antibiotics


Researchers report in the journal Nature that they have made a breakthrough in understanding how a powerful antibiotic agent is made in nature. Their discovery solves a decades-old mystery, and opens up new avenues of research into thousands of similar molecules, many of which are likely to be medically useful.  


The team focused on a class of compounds that includes dozens with antibiotic properties. The most famous of these is nisin, a natural product in milk that can be synthesized in the lab and is added to foods as a preservative. Nisin has been used to combat food-borne pathogens since the late 1960s.


Researchers have long known the sequence of the nisin gene, and they can assemble the chain of amino acids (called a peptide) that are encoded by this gene. But the peptide undergoes several modifications in the cell after it is made, changes that give it its final form and function. Researchers have tried for more than 25 years to understand how these changes occur.


“Peptides are a little bit like spaghetti; they’re too flexible to do their jobs,” said University of Illinois chemistry professor Wilfred van der Donk, who led the research with biochemistry professor Satish K. Nair. “So what nature does is it starts putting knobs in, or starts making the peptide cyclical.”


Special enzymes do this work. For nisin, an enzyme called a dehydratase removes water to help give the antibiotic its final, three-dimensional shape. This is the first step in converting the spaghetti-like peptide into a five-ringed structure, van der Donk said.

The rings are essential to nisin’s antibiotic function: Two of them disrupt the construction of bacterial cell walls, while the other three punch holes in bacterial membranes. This dual action is especially effective, making it much more difficult for microbes to evolve resistance to the antibiotic.


Stone Age tools: Innovation was local, not imported, in Eurasia more than 300,000 years ago


The analysis of artifacts from a 325,000-year-old site in Armenia shows that human technological innovation occurred intermittently throughout the Old World, rather than spreading from a single point of origin, as previously thought.


The study, published today in the journal Science, examines thousands of stone artifacts retrieved from Nor Geghi 1, a unique site preserved between two lava flows dated to 200,000-400,000 years ago. Layers of floodplain sediments and an ancient soil found between these lava flows contain the archaeological material. The dating of volcanic ash found within the sediments and detailed study of the sediments themselves allowed researchers to correlate the stone tools with a period between 325,000 and 335,000 years ago when Earth's climate was similar to today's.


The stone tools provide early evidence for the simultaneous use of two distinct technologies: biface technology, commonly associated with hand axe production during the Lower Paleolithic, and Levallois technology, a stone tool production method typically attributed to the Middle Stone Age in Africa and the Middle Paleolithic in Eurasia. Traditionally, archaeologists use the development of Levallois technology and the disappearance of biface technology to mark the transition from the Lower to the Middle Paleolithic roughly 300,000 years ago.


Archaeologists have argued that Levallois technology was invented in Africa and spread to Eurasia with expanding human populations, replacing local biface technologies in the process. This theory draws a link between populations and technologies and thus equates technological change with demographic change. The co-existence of the two technologies at Nor Geghi 1 provides the first clear evidence that local populations developed Levallois technology out of existing biface technology.


"The combination of these different technologies in one place suggests to us that, about 325,000 years ago, people at the site were innovative," says Daniel Adler, associate professor of Anthropology at the University of Connecticut, and the study's lead author. Moreover, the chemical analysis of several hundred obsidian artifacts shows that humans at the site utilized obsidian outcrops from as far away as 120 kilometers (approximately 75 miles), suggesting they must also have been capable of exploiting large, environmentally diverse territories.



New study strengthens link between Arctic sea-ice loss and extreme winters


Declining Arctic sea-ice has made severe winters across central Asia twice as likely, new research shows. The paper is the latest in a series linking very cold winters in the northern hemisphere to rapidly increasing temperatures in the Arctic. But the long-term picture suggests these cold winters might only be a temporary feature before further warming takes hold.


Temperatures in the Arctic are increasing almost twice as fast as the global average. This is known as Arctic amplification. As Arctic sea-ice shrinks, energy from the sun that would have been reflected away by sea-ice is instead absorbed by the ocean.


Arctic amplification has been linked with very cold winters in mid-latitude regions of the northern hemisphere. The UK, the US and Canada have all experienced extreme winters in recent years. Just last year, for example, the UK had its second-coldest March since records began, prompting the Met Office to call a rapid response meeting of experts to get to grips with whether melting Arctic sea-ice could be affecting British weather.


The new study, published in Nature Geoscience, suggests the likelihood of severe winters in central Asia has doubled over the past decade. This vast region includes southern Russia, Mongolia, Kazakhstan, and northern China. And it's the Arctic that's driving the changes once again, the authors say.


Droplets made to order

New method allows microdroplets of any shape to form on a surface.


Understanding liquid dynamics on surfaces can provide insight into nature’s design and enable fine manipulation capability in biological, manufacturing, microfluidic and thermal management applications. Of particular interest is the ability to control the shape of the droplet contact area on the surface, which is typically circular on a smooth homogeneous surface. A research team now shows the ability to tailor droplet contact areas in shapes ranging from squares and rectangles to hexagons, octagons and dodecagons by designing the structural or chemical heterogeneity of the surface. By characterizing the droplet side and top profiles, they also obtain the physical insights needed to develop a universal model for the three-dimensional droplet shape. Furthermore, arrays of droplets with controlled shapes and high spatial resolution can be achieved using this approach. This liquid-based patterning strategy promises low-cost fabrication of integrated circuits, conductive patterns and bio-microarrays for high-density information storage and miniaturized biochips and biosensors, among others.


Turning loss into gain: Cutting power could dramatically boost laser output

Lasers – devices that deliver beams of highly organized light – are so deeply integrated into modern technology that their basic operations would seem well understood. CD players, medical diagnostics and military surveillance all depend on lasers.



Re-examining longstanding beliefs about the physics of these devices, Princeton engineers have now shown that carefully restricting the delivery of power to certain areas within a laser could boost its output by many orders of magnitude. The finding, published Oct. 26 in the journal Nature Photonics, could allow far more sensitive and energy-efficient lasers, as well as potentially more control over the frequencies and spatial pattern of light emission.


"It's as though you are using loss to your advantage," said graduate student Omer Malik, an author of the study along with Li Ge, now an assistant professor at the City University of New York, and Hakan Tureci, assistant professor of electrical engineering at Princeton. The researchers said that restricting the delivery of power causes much of the physical space within a laser to absorb rather than produce light. In exchange, however, the optimally efficient portion of the laser is freed from competition with less efficient portions and shines forth far more brightly than previous estimates had suggested.


The results, based on mathematical calculations and computer simulations, still need to be verified in experiments with actual lasers, but the researchers said they represent a new understanding of the fundamental processes that govern how lasers produce light.

"Distributing gain and loss within the material is a higher level of design – a new tool – that had not been used very systematically until now," Tureci said.


The heart of a laser is a material that emits light when energy is supplied to it. When a low level of energy is added, the light is "incoherent," essentially meaning that it contains a mix of wavelengths (or colors). As more energy is added, the material suddenly reaches a "lasing" threshold when it emits coherent light of a particular wavelength.


The entire surface of the material does not emit laser light; rather, if the material is arranged as a disc, for example, the light might come from a ring close to the edge. As even more energy is added, more patterns emerge – for example a ring closer to the center might reach the laser threshold. These patterns – called modes – begin to interact and sap energy from each other. Because of this competition, subsequent modes requiring higher energy may never reach their lasing thresholds. However, Tureci's research group found that some of these higher threshold modes were potentially far more efficient than the earlier ones if they could just be allowed to function without competition.


The top 100 papers: NATURE magazine explores the most-cited research papers of all time


The discovery of high-temperature superconductors, the determination of DNA’s double-helix structure, the first observations that the expansion of the Universe is accelerating — all of these breakthroughs won Nobel prizes and international acclaim. Yet none of the papers that announced them comes anywhere close to ranking among the 100 most highly cited papers of all time.


Citations, in which one paper refers to earlier works, are the standard means by which authors acknowledge the source of their methods, ideas and findings, and are often used as a rough measure of a paper’s importance. Fifty years ago, Eugene Garfield published the Science Citation Index (SCI), the first systematic effort to track citations in the scientific literature. To mark the anniversary, Nature asked Thomson Reuters, which now owns the SCI, to list the 100 most highly cited papers of all time. The search covered all of Thomson Reuters’ Web of Science, an online version of the SCI that also includes databases covering the social sciences, arts and humanities, conference proceedings and some books. It lists papers published from 1900 to the present day.


The exercise revealed some surprises, not least that it takes a staggering 12,119 citations to rank in the top 100 — and that many of the world’s most famous papers do not make the cut. A few that do, such as the first observation of carbon nanotubes (number 36), are indeed classic discoveries. But the vast majority describe experimental methods or software that have become essential in their fields.


The most cited work in history, for example, is a 1951 paper describing an assay to determine the amount of protein in a solution. It has now gathered more than 305,000 citations — a recognition that always puzzled its lead author, the late US biochemist Oliver Lowry.


DARPA amplifier circuit achieves speeds of 1 trillion Hz, enters Guinness World Records


Officials from Guinness World Records have recognized DARPA’s Terahertz Electronics program for creating the fastest solid-state amplifier integrated circuit ever measured: one terahertz (10^12 Hz), or one trillion cycles per second — 150 billion cycles per second faster than the existing world record, set in 2012.


“This breakthrough could lead to revolutionary technologies such as high-resolution security imaging systems, improved collision-avoidance radar, communications networks with many times the capacity of current systems, and spectrometers that could detect potentially dangerous chemicals and explosives with much greater sensitivity,” said Dev Palmer, DARPA program manager.


Developed by Northrop Grumman Corporation, the Terahertz Monolithic Integrated Circuit (TMIC) exhibits power gains (amplification) several orders of magnitude beyond the current state of the art by using a super-scaled 25 nanometer gate-length indium phosphide high electron mobility transistor.


The TMIC showed a measured gain (on the logarithmic scale) of nine decibels at 1.0 terahertz and eight decibels at 1.03 terahertz. “Nine decibels of gain is unheard of at terahertz frequencies,” said Palmer. “This opens up new possibilities for building terahertz radio circuits.”
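Decibels are a logarithmic measure of power ratio (dB = 10·log10(Pout/Pin)), so the quoted gains can be turned into linear amplification factors. A minimal sketch; the conversion formula is the standard one, and only the figures quoted above are used:

```python
def db_to_power_ratio(db: float) -> float:
    """Convert a power gain in decibels to a linear power ratio: 10^(dB/10)."""
    return 10 ** (db / 10)

# Figures quoted for the TMIC: 9 dB at 1.00 THz, 8 dB at 1.03 THz.
for freq_thz, gain_db in [(1.00, 9.0), (1.03, 8.0)]:
    ratio = db_to_power_ratio(gain_db)
    print(f"{gain_db:.0f} dB at {freq_thz:.2f} THz is about {ratio:.1f}x power gain")
```

Nine decibels thus corresponds to roughly an eightfold power gain, and eight decibels to roughly sixfold.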


By contrast, current smartphone technology operates at one to two gigahertz and wireless networks at 5.7 gigahertz.


For years, researchers have been looking to exploit the high-frequency sub-millimeter-wave spectrum beginning above 300 gigahertz. Current electronics using solid-state technologies have largely been unable to access the sub-millimeter band of the electromagnetic spectrum due to insufficient transistor performance.


To address the “terahertz gap,” engineers have traditionally used frequency conversion—converting alternating current at one frequency to alternating current at another frequency—to multiply circuit operating frequencies up from millimeter-wave frequencies.


This approach, however, restricts the output power of electrical devices and adversely affects signal-to-noise ratio. Frequency conversion also increases device size, weight and power supply requirements.


Making designer mutants in all kinds of model organisms


Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms.


Modern developmental biology was born out of the fruitful marriage between traditional embryology and genetics. Genetic tools, together with advanced microscopy techniques, serve as the most fundamental means for developmental biologists to elucidate the logistics and the molecular control of growth, differentiation and morphogenesis. For this reason, model organisms with sophisticated and comprehensive genetic tools have been highly favored for developmental studies. Advances made in developmental biology using these genetically amenable models have been well recognized. The Nobel Prize in Physiology or Medicine was awarded in 1995 to Edward B. Lewis, Christiane Nüsslein-Volhard and Eric F. Wieschaus for their discoveries on the ‘Genetic control of early structural development’ using Drosophila melanogaster, and again in 2002 to John Sulston, Robert Horvitz and Sydney Brenner for their discoveries of ‘Genetic regulation of development and programmed cell death’ using the nematode worm Caenorhabditis elegans. These fly and worm systems remain powerful and popular models for invertebrate development studies, while zebrafish (Danio rerio), the frog species Xenopus laevis and Xenopus tropicalis, rat (Rattus norvegicus), and particularly mouse (Mus musculus) represent the most commonly used vertebrate model systems.

To date, random or semi-random mutagenesis (‘forward genetic’) approaches have been extraordinarily successful at advancing the use of these model organisms in developmental studies. With the advent of reference genomic data, however, sequence-specific genomic engineering tools (‘reverse genetics’) enable targeted manipulation of the genome and thus allow previously untestable hypotheses of gene function to be addressed.


Sequenced genomes reveal mutations that disable single genes and can help to identify new drugs

On average, every person carries mutations that inactivate at least one copy of 200 or so genes and both copies of around 20 genes. However, knockout mutations in any particular gene are rare, so very large populations are needed to study their effects. These ‘loss of function’ mutations have long been implicated in certain debilitating diseases, such as cystic fibrosis. Most, however, seem to be harmless — and some are even beneficial to the persons carrying them. “These are people we’re not going to find in a clinic, but they’re still really informative in biology,” says MacArthur.

His group and others had been focusing on genome data, but they are now also starting to mine patient-health records to determine the — sometimes subtle — effects of the mutations. In a study of more than 36,000 Finnish people, published in July (E. T. Lim et al. PLoS Genet. 10, e1004494; 2014), MacArthur and his team discovered that people lacking a gene called LPA might be protected from heart disease, and that another knockout mutation, carried in one copy of a gene by up to 2.4% of Finns, may cause fetuses to miscarry if it is present in both copies.

Bing Yu of the University of Texas Health Science Center in Houston told the meeting how he and his collaborators had compared knockout mutations found in more than 1,300 people with measurements of around 300 molecules in their blood. The team found that mutations in one gene, called SLCO1B1, were linked to high levels of fatty acids, a known risk factor for heart failure. And a team from the Wellcome Trust Sanger Institute in Hinxton, UK, reported that 43 genes whose inactivation is lethal to mice were found to be inactivated in humans who are alive and apparently well.


The poster child for human-knockout efforts is a new class of drugs that block a gene known as PCSK9 (see Nature 496, 152–155; 2013). The gene was discovered in French families with extremely high cholesterol levels in the early 2000s. But researchers soon found that people with rare mutations that inactivate one copy of PCSK9 have low cholesterol and rarely develop heart disease. The first PCSK9-blocking drugs should hit pharmacies next year, with manufacturers jostling for a share of a market that could reach US$25 billion in five years.


“I think there are hundreds more stories like PCSK9 out there, maybe even thousands,” in which a drug can mimic an advantageous loss-of-function mutation, says Eric Topol, director of the Scripps Translational Science Institute in La Jolla, California. Mark Gerstein, a bioinformatician at Yale University in New Haven, Connecticut, predicts that human knockouts will be especially useful for identifying drugs that treat diseases of ageing. “You could imagine there’s a gene that is beneficial to you as a 25-year-old, but the thing is not doing a good job for you when you’re 75.”


Drying Amazon Could Be Major Carbon Concern Going Forward


The lungs of the planet are drying out, threatening to cause Earth to cough up some of its carbon reserves. The Amazon rainforest inhales massive amounts of carbon dioxide from the atmosphere, helping keep the globe’s carbon budget in balance (at least until human emissions started throwing that balance off). But as a new study shows, drier conditions since 2000 have been reducing the forest’s lung capacity. And if the Amazon’s breaths become more shallow, it’s possible a feedback loop could set in, further reducing lung capacity and throwing the carbon balance further out of whack.


The study, published in the Proceedings of the National Academy of Sciences on Monday, shows that a decline in precipitation has contributed to less healthy vegetation since 2000. “It’s well-established fact that a large part of Amazon is drying. We’ve been able to link that decline in precipitation to a decline in greenness over the last 10 years,” said Thomas Hilker, lead author of the study and forestry expert at Oregon State University.


Since 2000, rainfall has decreased by up to 25 percent across a vast swath of the southeastern Amazon, according to the new satellite analysis by Hilker. The cause of the decline in rainfall hasn’t been pinpointed, though deforestation and changes in atmospheric circulation are possible culprits.


The decrease mostly affected an area of tropical forest 12 times the size of California, as well as adjacent grasslands and other forest types. The browning of that area, which is in the southern Amazon, accounted for more than half the loss of greenness observed by satellites. While the decrease in greenness is small compared with the overall lushness of the rainforest, the impacts could be outsized.


That’s because the amount of carbon the Amazon stores is staggering. An estimated 120 billion tons of carbon are stashed in its plants and soil. Much of that carbon gets there via the forest flora that suck carbon dioxide out of the atmosphere. Worldwide, “it essentially takes up 25 percent of global carbon cycle that vegetation is responsible for,” Hilker said. “It’s a huge carbon stock.”



255 Terabits/s: Researchers demonstrate record data transmission over new type of fiber


Researchers at Eindhoven University of Technology (TU/e) in the Netherlands and the University of Central Florida (CREOL), report in the journal Nature Photonics the successful transmission of a record high 255 Terabits/s over a new type of fiber allowing 21 times more bandwidth than currently available in communication networks. This new type of fiber could be an answer to mitigating the impending optical transmission capacity crunch caused by the increasing bandwidth demand.

Due to the popularity of Internet services and the emergence of capacity-hungry data centers, demand for telecommunication bandwidth is expected to continue growing at an exponential rate. To transmit more information through current optical glass fibers, one option is to increase the power of the signals to overcome the losses inherent in the glass from which the fiber is manufactured. However, this produces unwanted photonic nonlinear effects, which limit the amount of information that can be recovered after transmission over standard fiber.


The team at TU/e and CREOL, led by Dr. Chigo Okonkwo, an assistant professor in the Electro-Optical Communications (ECO) research group at TU/e, and Dr. Rodrigo Amezcua Correa, a research assistant professor in micro-structured fibers at CREOL, demonstrates the potential of a new class of fiber to increase transmission capacity and mitigate the impending 'capacity crunch' in their article, which appeared yesterday in the online edition of the journal Nature Photonics.


The new fiber has seven different cores through which the light can travel, instead of one in current state-of-the-art fibers. This compares to going from a one-way road to a seven-lane highway. Also, they introduce two additional orthogonal dimensions for data transportation – as if three cars can drive on top of each other in the same lane. Combining those two methods, they achieve a gross transmission throughput of 255 Terabits/s over the fiber link. This is more than 20 times the current standard of 4-8 Terabits/s.


WIRED: A Brief History of Mind-Bending Ideas About Black Holes


The story starts in 1784, when a geologist named John Michell was thinking deeply about Isaac Newton’s theory of gravity. In Newtonian physics, a cannonball can be shot into orbit around the Earth if it surpasses a particular speed, known as the planet’s escape velocity.


This speed depends on the mass and radius of the object you are trying to escape from. Michell’s insight was to imagine a body whose escape velocity was so great that it exceeded the speed of light – 300,000 kilometers per second – first measured in 1676 by the Danish astronomer Ole Romer.


Michell presented his results to other scientists, who speculated that massive “dark stars” might exist in abundance in the sky but be invisible because light can’t escape their surfaces. The French mathematician Pierre-Simon Laplace later made an independent discovery of these “dark stars” and both luminaries correctly calculated the very small radius – 6 kilometers – such an object would have if it were as massive as our sun.
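The "dark star" condition can be checked numerically: setting the Newtonian escape velocity sqrt(2GM/R) equal to c and solving for R gives R = 2GM/c^2 (numerically the same as the Schwarzschild radius). For one solar mass this comes out near 3 km in radius, i.e. roughly 6 km across. A sketch using standard physical constants (the constants are not from the article):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def escape_velocity(mass_kg: float, radius_m: float) -> float:
    """Newtonian escape velocity: v = sqrt(2*G*M / R)."""
    return math.sqrt(2 * G * mass_kg / radius_m)

def dark_star_radius_m(mass_kg: float) -> float:
    """Radius at which escape velocity equals c: R = 2*G*M / c^2."""
    return 2 * G * mass_kg / c**2

r = dark_star_radius_m(M_SUN)
print(round(r / 1000, 2))             # about 2.95 km radius (~6 km across)
print(escape_velocity(M_SUN, r) / c)  # ~1.0 by construction
```

Note the calculation gives a radius of about 3 km; the 6-kilometer figure quoted above matches this if read as the object's diameter.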


After the revolutions of 20th century physics, black holes got much weirder. In 1916, a short while after Einstein published the complex equations underpinning General Relativity (which Einstein himself couldn’t entirely solve), a German astronomer named Karl Schwarzschild showed that a massive object squeezed to a single point would warp space around it so much that even light couldn’t escape. Though the cartoon version of black holes has them sucking everything up like a vacuum cleaner, light would only be unable to escape Schwarzschild’s object if it was inside a particular radius, called the Schwarzschild radius. Beyond this “event horizon,” you could safely leave the vicinity of a black hole.


Neither Schwarzschild nor Einstein believed this object was anything other than a mathematical curiosity. It took a much better understanding of the lives of stars before black holes were taken seriously. You see, a star only works because it preserves a delicate balance between gravity, which is constantly trying to pull its mass inward, and the nuclear furnace in its belly, which exerts pressure outward. At some point a star runs out of fuel and the fusion at its core turns off. Gravity is given the upper hand, causing the star to collapse. For stars like our sun, this collapse is halted when the electrons in the star’s atoms get so close that they generate a quantum mechanical force called electron degeneracy pressure. An object held up by this pressure is called a white dwarf.


In 1930, the Indian physicist Subrahmanyan Chandrasekhar showed that, given enough mass, a star’s gravity could overcome this electron degeneracy pressure, squeezing all its protons and electrons into neutrons. Though a neutron degeneracy pressure could then hold the weight up, forming a neutron star, the physicist Robert Oppenheimer found that an even more massive object could overcome this final outward pressure, allowing gravity to win and crushing everything down to a single point. Scientists slowly accepted that these things were real objects, not just weird mathematical solutions to the equations of General Relativity. In 1967, physicist John Wheeler used the term “black hole” to describe them in a public lecture, a name that has stuck ever since.




Breakthrough in molecular electronics paves the way for DNA-based computer circuits in the future


Molecular electronics, which uses molecules as building blocks for the fabrication of electronic components, was seen as the ultimate solution to the miniaturization challenge. However, to date, no one has actually been able to make complex electrical circuits using molecules. The only known molecules that can be pre-designed to self-assemble into complex miniature circuits, which could in turn be used in computers, are DNA molecules. Nevertheless, so far no one has been able to demonstrate reliably and quantitatively the flow of electrical current through long DNA molecules.


Now, an international group led by Prof. Danny Porath, the Etta and Paul Schankerman Professor in Molecular Biomedicine at the Hebrew University of Jerusalem, reports reproducible and quantitative measurements of electricity flow through long molecules made of four DNA strands, signaling a significant breakthrough towards the development of DNA-based electrical circuits. The research, which could re-ignite interest in the use of DNA-based wires and devices in the development of programmable circuits, appears in the prestigious journal Nature Nanotechnology under the title "Long-range charge transport in single G-quadruplex DNA molecules."


Prof. Porath is affiliated with the Hebrew University's Institute of Chemistry and its Center for Nanoscience and Nanotechnology. The molecules were produced by the group of Alexander Kotlyar from Tel Aviv University, who has been collaborating with Porath for 15 years. The measurements were performed mainly by Gideon Livshits, a PhD student in the Porath group, who carried the project forward with great creativity, initiative and determination. The research was carried out in collaboration with groups from Denmark, Spain, US, Italy and Cyprus.


According to Prof. Porath, "This research paves the way for implementing DNA-based programmable circuits for molecular electronics, which could lead to a new generation of computer circuits that can be more sophisticated, cheaper and simpler to make."


New evidence for an exotic, predicted superconducting state found

A research team led by a Brown University physicist has produced new evidence for an exotic superconducting state, first predicted a half-century ago, that can arise when a superconductor is exposed to a strong magnetic field.


"It took 50 years to show that this phenomenon indeed happens," said Vesna Mitrovic, associate professor of physics at Brown University, who led the work. "We have identified the microscopic nature of this exotic quantum state of matter."


The research is published in Nature Physics.


Superconductivity—the ability to conduct electric current without resistance—depends on the formation of electron twosomes known as Cooper pairs (named for Leon Cooper, a Brown University physicist who shared the Nobel Prize for identifying the phenomenon). In a normal conductor, electrons rattle around in the structure of the material, which creates resistance. But Cooper pairs move in concert in a way that keeps them from rattling around, enabling them to travel without resistance.


Magnetic fields are the enemy of Cooper pairs. In order to form a pair, electrons must be opposites in a property that physicists refer to as spin. Normally, a superconducting material has a roughly equal number of electrons with each spin, so nearly all electrons have a dance partner. But strong magnetic fields can flip "spin-down" electrons to "spin-up", making the spin population in the material unequal.


"The question is what happens when we have more electrons with one spin than the other," Mitrovic said. "What happens with the ones that don't have pairs? Can we actually form superconducting states that way, and what would that state look like?"


In 1964, physicists predicted that superconductivity could indeed persist in certain kinds of materials amid a magnetic field. The prediction was that the unpaired electrons would gather together in discrete bands or stripes across the superconducting material. Those bands would conduct normally, while the rest of the material would be superconducting. This modulated superconductive state came to be known as the FFLO phase, named for theorists Peter Fulde, Richard Ferrell, Anatoly Larkin, and Yuri Ovchinnikov, who predicted its existence.

To investigate the phenomenon, Mitrovic and her team used an organic superconductor with the catchy name κ-(BEDT-TTF)2Cu(NCS)2. The material consists of ultra-thin sheets stacked on top of each other and is exactly the kind of material predicted to exhibit the FFLO state.


After applying an intense magnetic field to the material, Mitrovic and her collaborators from the French National High Magnetic Field Laboratory in Grenoble probed its properties using nuclear magnetic resonance (NMR). What they found were regions across the material where unpaired, spin-up electrons had congregated. These "polarized" electrons behave, "like little particles constrained in a box," Mitrovic said, and they form what are known as Andreev bound states.


To beat malaria and dengue fever, vaccinate the mosquitoes


If there’s one thing the malaria parasite wants, it’s to get inside the guts of a mosquito. Once there, it releases hundreds of wormlike cells that enter the human body through a bloodsucking bite. Now, scientists have found a way to make mosquitoes much less hospitable to this pathogen, as well as the one that causes dengue: stacking the insect’s gut with killer microbes that wipe out the invaders before they have a chance to cause disease.


Like humans and most other animals, mosquitoes are stuffed with microbes that live on and inside of them—their microbiome. When studying the microbes that make mosquitoes their home, researchers came across one called Chromobacterium sp. (Csp_P). They already knew that Csp_P’s close relatives were capable of producing powerful antibiotics, and they wondered if Csp_P might share the same talent.


The team cultured Csp_P in a sugar solution and in blood and fed both concoctions to mosquitoes whose natural microbiomes had already been eliminated with doses of antibiotics. As the scientists hoped, Csp_P quickly took over the mosquito’s gut after being ingested by means of the sugar solution — and even more quickly when it was fed to them in blood. In another experiment, done with mosquitoes that weren’t pretreated with antibiotics, Csp_P-fed mosquitoes were given blood containing the dengue virus and Plasmodium falciparum, a single-celled parasite that causes the most deadly type of malaria. Although a large number of the mosquitoes died within a few days of being infected by the Chromobacterium, the malaria and dengue pathogens were far less successful at infecting the mosquitoes that did survive, the team reports today in PLOS Pathogens. That’s good news: If the mosquito isn’t infected by the disease-causing germs, it is less likely to be able to transmit the pathogens to humans.


The team, from Johns Hopkins University in Baltimore, Maryland, also exposed the malaria parasite and dengue virus to lab cultures of Csp_P to test for anti-Plasmodium and antidengue activity. Here, too, they found that the bacteria inhibited the growth of the pathogens.


The researchers say there could be two mechanisms by which Csp_P fights off Plasmodium and dengue infections. First, because Csp_P is toxic to mosquitoes, it activates the insect’s immune system. This has the collateral benefit of staving off infection from Plasmodium and dengue virus, which otherwise would have thrived in the mosquito’s gut. But that’s not all, says George Dimopoulos, a parasitologist at Johns Hopkins who led the research team. Because the bacterium also snuffs out Plasmodium and the dengue virus in the laboratory, it means Csp_P is producing toxic compounds that kill the pathogens directly.


Dimopoulos and his colleagues believe Csp_P could be used to “vaccinate” mosquitoes against the malaria and dengue pathogens, perhaps through the use of sugar-baited traps that are already used to spread insecticide through populations of the pest. This would have the twin effect of killing most mosquitoes while severely curbing the survivors’ ability to spread disease. This one-two punch is “a unique property” for any malaria-control agent, says David Fidock, a microbiologist at Columbia University, who was not involved in the study. “No current malaria-control agent does both.”

Human skin cells reprogrammed directly into brain cells with combination of microRNAs and transcription factors

Scientists have described a way to convert human skin cells directly into a specific type of brain cell affected by Huntington’s disease, an ultimately fatal neurodegenerative disorder. Unlike other techniques that turn one cell type into another, this new process does not pass through a stem cell phase, avoiding the production of multiple cell types, the study’s authors report.


The researchers, at Washington University School of Medicine in St. Louis, demonstrated that these converted cells survived at least six months after injection into the brains of mice and behaved similarly to native cells in the brain.


“Not only did these transplanted cells survive in the mouse brain, they showed functional properties similar to those of native cells,” said senior author Andrew S. Yoo, PhD, assistant professor of developmental biology. “These cells are known to extend projections into certain brain regions. And we found the human transplanted cells also connected to these distant targets in the mouse brain. That’s a landmark point about this paper.”


The work appears Oct. 22 in the journal Neuron.


The investigators produced a specific type of brain cell called medium spiny neurons, which are important for controlling movement. They are the primary cells affected in Huntington’s disease, an inherited genetic disorder that causes involuntary muscle movements and cognitive decline usually beginning in middle adulthood. Patients with the condition live about 20 years following the onset of symptoms, which steadily worsen over time.

The research involved adult human skin cells, rather than more commonly studied mouse cells or even human cells at an earlier stage of development. In regard to potential future therapies, the ability to convert adult human cells presents the possibility of using a patient’s own skin cells, which are easily accessible and won’t be rejected by the immune system.


To reprogram these cells, Yoo and his colleagues put the skin cells in an environment that closely mimics the environment of brain cells. They knew from past work that exposure to two small molecules of RNA, a close chemical cousin of DNA, could turn skin cells into a mix of different types of neurons.


In a skin cell, the DNA instructions for how to be a brain cell, or any other type of cell, is neatly packed away, unused. In past research published in Nature, Yoo and his colleagues showed that exposure to two microRNAs called miR-9 and miR-124 altered the machinery that governs packaging of DNA. Though the investigators still are unraveling the details of this complex process, these microRNAs appear to be opening up the tightly packaged sections of DNA important for brain cells, allowing expression of genes governing development and function of neurons.


Weeks after winning a Nobel Prize for his microscope, Eric Betzig just revolutionized microscopy again

The new technique allows high-speed, high-resolution shots of cells in action.


Earlier this month Eric Betzig shared the Nobel Prize in chemistry for his work on high-resolution microscopes -- specifically the one he'd designed and built on a friend's living room floor. But when Betzig, a researcher at the Howard Hughes Medical Institute's Janelia Research Campus in Ashburn, Virginia, got news of his win, his best work yet was still a few weeks away from being published. Thursday in Science, he and a team of his colleagues reported on a new microscopy technique that allows them to observe living cellular processes at groundbreaking resolution and speed. Betzig came up with his Nobel-winning microscope (PALM) when he'd grown frustrated with the limitations of other microscope technologies. The so-called lattice light-sheet microscopy that he describes in Thursday's paper was the result of his eventual boredom with PALM.


"Again, I just started to understand the limits of the technology," Betzig said. PALM was great at looking at living systems, but only when they moved slowly. It couldn't take measurements quickly enough to get high-resolution pictures of fast cellular divisions.


Trying to understand biology using these microscopes is like piecing together a football game from high-resolution photos, Betzig said: You can see images of a pass, and a touchdown, and of the cheerleaders doing a pyramid. But the rules of the game would only become clear once you saw a game on video.


"I'd been looking at those pictures my whole life," Betzig said. "It was time to take a look at the living stuff in action." Until now, the best microscope for viewing living systems as they moved were confocal microscopes. They beam light down onto a sample of cells. The light penetrates the whole sample and bounces back.


But even though a scientist can only focus his lens on one small section of the sample, light is being blasted onto the cells from above and below. This causes two problems: It creates a sort of haze around the area being focused on, and it's also damaging to the cell sample. The light is toxic, and degrades the living system over time. Betzig's new microscope solves this by generating a sheet of light that comes in from the side of the sample, made up of a series of beams that harm the sample less than one solid cone of light. Scientists can now snap a high-res image of the entire section they're illuminating, without exposing the rest of the sample to any light at all.
