Amazing Science
Scooped by Dr. Stefan Gruenwald

Most abundant ocean viruses attack bacteria that are important for the carbon cycle


In one corner is the Earth’s most abundant organism: SAR11, an ocean-living bacterium that survives where most other cells would die and plays a major role in the planet’s carbon cycle. It had been theorized that SAR11 was so small and widespread that it must be invulnerable to attack.

 

In the other corner, and so strange looking that scientists previously didn’t even recognize what they were, are “Pelagiphages,” viruses now known to infect SAR11 and routinely kill millions of these cells every second.

 

How this fight turns out is of more than casual interest, because SAR11 has a huge effect on the amount of carbon dioxide that enters the atmosphere, and the overall biology of the oceans.

 

“There’s a war going on in our oceans, a huge war, and we never even saw it,” says Stephen Giovannoni, a professor of microbiology at Oregon State University. “This is an important piece of the puzzle in how carbon is stored or released in the sea.”

 

The paper in Nature describes four previously unknown viruses that infect SAR11. To prove the viruses were as abundant as their hosts, Giovannoni and colleagues teamed up with researchers at the University of Arizona’s Tucson Marine Phage Research Lab, led by Matthew Sullivan, who had developed accurate methods for measuring viral diversity in nature.

 

The analysis shows that the new viruses—like their hosts—are the most abundant on record. Giovannoni’s group discovered the Pelagiphage viruses using “old-fashioned” research methods, growing the cells and viruses in a laboratory rather than relying on the tools of modern genomics.

 

“Because they are so new, these viruses were virtually unrecognizable to us based on their DNA,” Giovannoni says. “The viruses themselves, of course, appear to be just as abundant as SAR11.”

 

Sullivan explains that the genome-based method for discovering viruses in the oceans, which his group developed over four years, is at least 1,000 times more accurate than previous methods.

 

Their work resulted in the Pacific Ocean Virome dataset. This dataset, Sullivan explains, is the viral equivalent of the Global Ocean Sampling Expedition by former human genome researcher J. Craig Venter, who sailed across the world’s oceans sampling, sequencing, and analyzing the DNA of the microorganisms living in these waters. The new findings on SAR11 disprove the theory that the bacteria are immune to viral predation, Giovannoni and his co-authors say.

 

“In general, every living cell is vulnerable to viral infection,” says Giovannoni, who first discovered SAR11 in 1990. “What has been so puzzling about SAR11 was its sheer abundance: there was simply so much of it that some scientists believed it must not get attacked by viruses.” What the new research shows, Giovannoni says, is that SAR11 is competitive, good at scavenging organic carbon, and effective at changing quickly to avoid infection. Because of this, it thrives and persists in abundance even though the new viruses are constantly killing it.

 


20,000+ FREE Online Science and Technology Lectures from Top Universities


NOTE: To subscribe to the RSS feed of Amazing Science, copy http://www.scoop.it/t/amazing-science/rss.xml into the URL field of your browser and click "subscribe".

 

This newsletter is aggregated from over 1450 news sources:

http://www.genautica.com/links/1450_news_sources.html

 

All my Tweets and Scoop.It! posts sorted and searchable:

http://www.genautica.com/tweets/index.html

 

••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••

You can search through all the articles semantically on my archived twitter feed.

••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••

NOTE: All articles in the Amazing Science newsletter can also be sorted by topic. To do so, click the FIND button (symbolized by the FUNNEL at the top right of the screen) to display all the relevant postings SORTED by TOPICS.

 

You can also type your own query:

 

e.g., if you are looking for articles involving "dna" as a keyword:

 

http://www.scoop.it/t/amazing-science/?q=dna


Or CLICK on the little FUNNEL symbol at the top right of the screen

••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••••


MOST_READ • 3D_printing • aging • AI • anthropology • art • astronomy • bigdata • bioinformatics • biology • biotech • chemistry • computers • cosmology • education • environment • evolution • future • genetics • genomics • geosciences • green_energy • history • language • map • material_science • math • med • medicine • microscopy • nanotech • neuroscience • paleontology • photography • photonics • physics • postings • robotics • science • technology • video 

Casper Pieters's curator insight, March 9, 7:21 PM

Great resources for online learning just about everything. All you need is will power and self-discipline.

Russ Roberts's curator insight, April 23, 11:37 PM

A very interesting site. Amazing Science covers many disciplines. Subscribe to the newsletter and be "amazed." Aloha, Russ, KH6JRM.

Siegfried Holle's curator insight, July 4, 8:45 AM

Your knowledge is your strength and power 


New correction to speed of light could explain SN1987a dual-pulse neutrino burst


The effect of gravity on virtual electron–positron pairs as they propagate through space could lead to a violation of Einstein's equivalence principle, according to calculations by James Franson at the University of Maryland, Baltimore County. While the effect would be too tiny to be measured directly using current experimental techniques, it could explain a puzzling anomaly observed during the famous 1987 supernova SN1987a.


In modern theoretical physics, three of the four fundamental forces – electromagnetism, the weak nuclear force and the strong nuclear force – are described by quantum mechanics. The fourth force, gravity, does not currently have a quantum formulation and is best described by Einstein's general theory of relativity. Reconciling relativity with quantum mechanics is therefore an important and active area of physics.


An open question for theoretical physicists is how gravity acts on a quantum object such as a photon. Astronomical observations have shown repeatedly that light is attracted by a gravitational field. Traditionally, this is described using general relativity: the gravitational field bends space–time, and the light is slowed down (and slightly deflected) as it passes through the curved region. In quantum electrodynamics, a photon propagating through space can occasionally annihilate with itself, creating a virtual electron–positron pair. Soon after, the electron and positron recombine to recreate the photon. If they are in a gravitational potential then, for the short time they exist as massive particles, they feel the effect of gravity. When they recombine, they will create a photon with an energy that is shifted slightly and that travels slightly slower than if there was no gravitational potential.


Franson scrutinized these two explanations for why light slows down as it passes through a gravitational potential. He decided to calculate how much the light should slow down according to each theory, anticipating that he would get the same answer. However, he was in for a surprise: the predicted changes in the speed of light do not match, and the discrepancy has some very strange consequences.


Franson calculated that, treating light as a quantum object, the change in a photon's velocity depends not on the strength of the gravitational field, but on the gravitational potential itself. However, this leads to a violation of Einstein's equivalence principle – that gravity and acceleration are indistinguishable – because, in a gravitational field, the gravitational potential is created along with mass, whereas in a frame of reference accelerating in free fall, it is not. Therefore, one could distinguish gravity from acceleration by whether a photon slows down or not when it undergoes particle–antiparticle creation.


An important example is a photon and a neutrino propagating in parallel through space. A neutrino cannot annihilate to create an electron–positron pair, so the photon will slow down more than the neutrino as they pass through a gravitational field, potentially letting the neutrino travel faster than light through that region of space. However, if the problem is viewed in a frame of reference falling freely into the gravitational field, neither the photon nor the neutrino slows down at all, so the photon continues to travel faster than the neutrino.


While the idea that the laws of physics can be dependent on one's frame of reference seems nonsensical, it could explain an anomaly in the 1987 observation of supernova SN1987a. An initial pulse of neutrinos was detected 7.7 hours before the first light from SN1987a reached Earth. This was followed by a second pulse of neutrinos, which arrived about three hours before the supernova light. Supernovae are expected to emit large numbers of neutrinos and the three-hour gap between the second burst of neutrinos and the arrival of the light agrees with the current theory of how a star collapses to create a supernova.


The first pulse of neutrinos is generally thought to be unrelated to the supernova. However, the probability of such a coincidence is statistically unlikely. If Franson's results are correct, then the 7.7-hour gap between the first pulse of neutrinos and the arrival of the light could be explained by the gravitational potential of the Milky Way slowing down the light. This does not explain why two neutrino pulses preceded the light, but Franson suggests the second pulse could be related to a two-step collapse of the star.


The research is published in the New Journal of Physics.


NASA's next Mars rover will make oxygen, to sustain life


For 17 years, NASA rovers have laid down tire tracks on Mars. But details the space agency divulged this week about its next Martian exploration vehicle underscored NASA's ultimate goal. Footprints are to follow someday.


The last three rovers -- Spirit, Opportunity and Curiosity -- confirmed the Red Planet's ability to support life and searched for signs of past life. The Mars rover of the next decade will home in on ways to sustain future life there, human life.


"The 2020 rover will help answer questions about the Martian environment that astronauts will face and test technologies they need before landing on, exploring and returning from the Red Planet," said NASA's William Gerstenmaier, who works on human missions. This will include experiments that convert carbon dioxide in the Martian atmosphere into oxygen "for human respiration." Oxygen could also be used on Mars in making rocket fuel that would allow astronauts to refill their tanks.


The 2020 rover is nearly a spitting image of Curiosity, and NASA's Jet Propulsion Laboratory announced plans to launch the new edition not long after Curiosity landed on Mars in 2012. But the 2020 rover has new and improved features. The Mars Oxygen ISRU Experiment, or MOXIE for short, is just one. There are super cameras that will send back 3D panoramic images and spectrometers that will analyze the chemical makeup of minerals with an apparent eye to farming.


"An ability to live off the Martian land would transform future exploration of the planet," NASA said in a statement. The 2020 rover will also create a job for a future mission to complete, once the technology emerges to return to Earth from Mars. It will collect soil samples to be sent back for lab analysis at NASA.


Nanostructured metal-oxide catalyst efficiently converts CO2 to methanol


Scanning tunneling microscope image of a cerium-oxide and copper catalyst (CeOx-Cu) used in the transformation of carbon dioxide (CO2) and hydrogen (H2) gases.


Scientists at Brookhaven National Laboratory have discovered a new catalytic system for converting carbon dioxide (CO2) to methanol — a key commodity used to create a wide range of industrial chemicals and fuels. With significantly higher activity than other catalysts now in use, the new system could make it easier to get normally unreactive CO2 to participate in these reactions.


“Developing an effective catalyst for synthesizing methanol from CO2 could greatly expand the use of this abundant gas as an economical feedstock,” said Brookhaven chemist Jose Rodriguez, who led the research. “It’s even possible to imagine a future in which such catalysts help capture CO2 emitted from methanol-powered combustion engines and fuel cells, and recycling it to synthesize new fuel,” he said.

That future, of course, will be determined by a variety of factors, including economics.


The research team, which included scientists from Brookhaven, the University of Seville in Spain, and Central University of Venezuela, describes their results in the August 1, 2014, issue of the journal Science.


Because CO2 is normally such a reluctant participant in chemical reactions, interacting weakly with most catalysts, it’s also rather difficult to study. The new studies required the use of newly developed in-situ (or on-site, meaning under reaction conditions) imaging and chemical “fingerprinting” techniques.


These techniques allowed the scientists to peer into the dynamic evolution of a variety of catalysts as they operated in real time. The scientists also used computational modeling at the University of Seville and the Barcelona Supercomputing Center to provide a molecular description of the methanol synthesis mechanism.


The team was particularly interested in exploring a catalyst composed of copper and ceria (cerium-oxide) nanoparticles, sometimes also mixed with titania. The scientists’ previous studies with such metal-oxide nanoparticle catalysts have demonstrated their exceptional reactivity in a variety of reactions. In those studies, the interfaces of the two types of nanoparticles turned out to be critical to the reactivity of the catalysts, with highly reactive sites forming at regions where the two phases meet.


To explore the reactivity of such dual particle catalytic systems in converting CO2 to methanol, the scientists used spectroscopic techniques to investigate the interaction of CO2 with plain copper, plain cerium-oxide, and cerium-oxide/copper surfaces at a range of reaction temperatures and pressures. Chemical fingerprinting was combined with computational modeling to reveal the most probable progression of intermediates as the reaction from CO2 to methanol proceeded.

These studies revealed that the metal component of the catalysts alone could not carry out all the chemical steps necessary for the production of methanol. The most effective binding and activation of CO2 occurred at the interfaces between metal and oxide nanoparticles in the cerium-oxide/copper catalytic system.


“The key active sites for the chemical transformations involved atoms from the metal [copper] and oxide [ceria or ceria/titania] phases,” said Jesus Graciani, a chemist from the University of Seville and first author on the paper. The resulting catalyst converts CO2 to methanol more than a thousand times faster than plain copper particles, and almost 90 times faster than a common copper/zinc-oxide catalyst currently in industrial use.
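Taking the quoted speed-up factors at face value, one can also back out how the two reference catalysts compare with each other. The sketch below is illustrative arithmetic only; the article does not report absolute rate constants:

```python
# Quoted ratios relative to the new CeOx-Cu catalyst (illustrative only;
# absolute reaction rates are not given in the article).
vs_plain_copper = 1000   # new catalyst: >1,000x faster than plain Cu
vs_cu_zno = 90           # new catalyst: ~90x faster than industrial Cu/ZnO

# Implied ratio: industrial Cu/ZnO catalyst versus plain copper.
cu_zno_vs_plain = vs_plain_copper / vs_cu_zno
print(f"Cu/ZnO is roughly {cu_zno_vs_plain:.0f}x faster than plain copper")
```

In other words, the quoted figures imply that the industrial Cu/ZnO catalyst is itself roughly an order of magnitude more active than plain copper.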


Comprehensive molecular characterization of gastric cancer shows 4 major types


Gastric cancer was the world’s third leading cause of cancer mortality in 2012, responsible for 723,000 deaths [1]. The vast majority of gastric cancers are adenocarcinomas, which can be further subdivided into intestinal and diffuse types according to the Lauren classification [2]. An alternative system, proposed by the World Health Organization, divides gastric cancer into papillary, tubular, mucinous (colloid) and poorly cohesive carcinomas [3]. These classification systems have little clinical utility, making the development of robust classifiers that can guide patient therapy an urgent priority.


Analysis of gastric cancer's molecular and clinical characteristics has been complicated by histological and etiological heterogeneity. A team of scientists now describes in Nature a comprehensive molecular evaluation of 295 primary gastric adenocarcinomas as part of The Cancer Genome Atlas (TCGA) project. They propose a molecular classification dividing gastric cancer into four subtypes: (i) tumors positive for Epstein–Barr virus, which display recurrent PIK3CA mutations, extreme DNA hypermethylation, and amplification of JAK2, CD274 (also known as PD-L1) and PDCD1LG2 (also known as PD-L2); (ii) microsatellite-unstable tumors, which show elevated mutation rates, including mutations of genes encoding targetable oncogenic signaling proteins; (iii) genomically stable tumors, which are enriched for the diffuse histological variant and mutations of RHOA or fusions involving RHO-family GTPase-activating proteins; and (iv) tumors with chromosomal instability, which show marked aneuploidy and focal amplification of receptor tyrosine kinases. Identification of these subtypes provides a roadmap for patient stratification and trials of targeted therapies.


The majority of gastric cancers are associated with infectious agents, including the bacterium Helicobacter pylori [4] and Epstein–Barr virus (EBV). The distribution of histological subtypes of gastric cancer and the frequencies of H. pylori and EBV associated gastric cancer vary across the globe [5]. A small minority of gastric cancer cases are associated with germline mutation in E-cadherin (CDH1) [6] or mismatch repair genes [7] (Lynch syndrome), whereas sporadic mismatch repair-deficient gastric cancers have epigenetic silencing of MLH1 in the context of a CpG island methylator phenotype (CIMP) [8]. Molecular profiling of gastric cancer has been performed using gene expression or DNA sequencing [9,10,11,12], but has not led to a clear biologic classification scheme. The goals of this study by The Cancer Genome Atlas (TCGA) were to develop a robust molecular classification of gastric cancer and to identify dysregulated pathways and candidate drivers of distinct classes of gastric cancer.


Mysterious Siberian crater attributed to methane released from thawing permafrost


A mystery crater spotted in the frozen Yamal peninsula in Siberia earlier this month was probably caused by methane released as permafrost thawed, researchers in Russia say.


Air near the bottom of the crater contained unusually high concentrations of methane — up to 9.6% — in tests conducted at the site on 16 July, says Andrei Plekhanov, an archaeologist at the Scientific Centre of Arctic Studies in Salekhard, Russia. Plekhanov, who led an expedition to the crater, says that air normally contains just 0.000179% methane.
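For scale, the two figures quoted above imply an enormous methane enrichment at the crater. A quick back-of-the-envelope check, using only the concentrations reported in the article:

```python
crater_ch4 = 9.6          # % methane measured near the crater bottom
ambient_ch4 = 0.000179    # % methane in normal air, per Plekhanov

enrichment = crater_ch4 / ambient_ch4
print(f"Methane enrichment: ~{enrichment:,.0f}x ambient levels")
# i.e., on the order of fifty thousand times the normal concentration
```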


Since the hole was spotted in mid-July by a helicopter pilot, conjecture has abounded about how the 30-metre-wide crater was formed — a gas or missile explosion, a meteorite impact and alien involvement have all been suggested.


But Plekhanov and his team believe that it is linked to the abnormally hot Yamal summers of 2012 and 2013, which were warmer than usual by an average of about 5°C. As temperatures rose, the researchers suggest, permafrost thawed and collapsed, releasing methane that had been trapped in the icy ground.

Other researchers argue that long-term global warming might be to blame — and that a slow and steady thaw in the region could have been enough to free a burst of methane and create such a big crater. Over the past 20 years, permafrost at a depth of 20 metres has warmed by about 2°C, driven by rising air temperatures [1], notes Hans-Wolfgang Hubberten, a geochemist at the Alfred Wegener Institute in Potsdam, Germany.

Bioinformatics: Big data from DNA Sequencing is giving new Insights into Cancer Development and Treatment Options

The torrents of data flowing out of cancer research and treatment are yielding fresh insight into the disease.


In 2013, geneticist Stephen Elledge answered a question that had puzzled cancer researchers for nearly 100 years. In 1914, German biologist Theodor Boveri suggested that the abnormal number of chromosomes — called aneuploidy — seen in cancers might drive the growth of tumors. For most of the next century, researchers made little progress on the matter. They knew that cancers often have extra or missing chromosomes or pieces of chromosomes, but they did not know whether this was important or simply a by-product of tumor growth — and they had no way of finding out.


Elledge found that where aneuploidy had resulted in missing tumor-suppressor genes, or extra copies of the oncogenes that promote cancer, tumors grow more aggressively (T. Davoli et al. Cell 155, 948–962; 2013). His insight — that aneuploidy is not merely an odd feature of tumors, but an engine of their growth — came from mining voluminous amounts of cellular data. And, says Elledge, it shows how the ability of computers to sift through ever-growing troves of information can help us to deepen our understanding of cancer and open the door to discoveries.


Modern cancer care has the potential to generate huge amounts of data. When a patient is diagnosed, the tumor's genome might be sequenced to see if it is likely to respond to a particular drug. The sequencing might be repeated as treatment progresses to detect changes. The patient might have his or her normal tissue sequenced as well, a practice that is likely to grow as costs come down. The doctor will record the patient's test results and medical history, including dietary and smoking habits, in an electronic health record. The patient may also have computed tomography (CT) and magnetic resonance imaging (MRI) scans to determine the stage of the disease. Multiply all that by the nearly 1.7 million people diagnosed with cancer in 2013 in the United States alone and it becomes clear that oncology is going to generate even more data than it does now. Computers can mine the data for patterns that may advance the understanding of cancer biology and suggest targets for therapy.


Elledge's discovery was the result of a computational method that he and his colleagues developed, called the Tumor Suppressor and Oncogene Explorer. They used it to mine large data sets, including the Cancer Genome Atlas, maintained by the US National Cancer Institute, based in Bethesda, Maryland, and the Catalogue of Somatic Mutations in Cancer, run by the Wellcome Trust Sanger Institute in Hinxton, UK. The databases contained roughly 1.2 million mutations from 8,207 tissue samples of more than 20 types of tumor.


Analyzing the genomes of 8,200 tumors is just a start. Researchers are “trying to figure out how we can bring together and analyze, over the next few years, a million genomes”, says Robert Grossman, who directs the Initiative in Data Intensive Science at the University of Chicago in Illinois. This is an immense undertaking; the combined cancer genome and normal genome from a single patient constitutes about 1 terabyte (1012 bytes) of data, so a million genomes would generate an exabyte (1018 bytes). Storing and analyzing this much data could cost US$100 million a year, Grossman says.
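Grossman's exabyte figure follows directly from the numbers stated above. A quick sanity check, assuming decimal units (1 TB = 10^12 bytes) as the text does:

```python
bytes_per_patient = 10**12   # ~1 TB: paired tumor + normal genome per patient
n_genomes = 10**6            # goal: one million genomes

total_bytes = bytes_per_patient * n_genomes
exabytes = total_bytes / 10**18
print(f"Total storage: {exabytes:.0f} EB")  # 1 EB, matching the article
```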


But it is the new technologies that are creating an information boom. “We can collect data faster than we can physically do anything with them,” says Manish Parashar, a computer scientist and head of the Rutgers Discovery Informatics Institute in Piscataway, New Jersey, who collaborates with Foran to find ways of handling the information. “There are some fundamental challenges being caused by our ability to capture so much data,” he says.


A major problem with data sets at the terabyte-and-beyond level is figuring out how to manipulate all the data. A single high-resolution medical image can take up tens of gigabytes, and a researcher might want the computer to compare tens of thousands of such images. Breaking down just one image in the Rutgers project into sets of pixels that the computer can identify takes about 15 minutes, and moving that much information from where it is stored to where it can be processed is difficult. “Already we have people walking around with disk drives because you can't effectively use the network,” Parashar says.


Informatics researchers are developing algorithms to split data into smaller packets for parallel processing on separate processors, and to compress files without omitting any relevant information. And they are relying on advances in computer science to speed up processing and communications in general.
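The split-into-packets idea described above is essentially the classic chunk-and-process pattern. The following is only a generic sketch of that pattern; the chunk size, worker count, and toy per-chunk checksum are placeholder assumptions, not the actual Rutgers algorithms:

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_chunks(data, chunk_size):
    """Break a large byte sequence into independently processable pieces."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def analyze_chunk(chunk):
    """Placeholder per-chunk analysis (stands in for, e.g., pixel-set extraction)."""
    return sum(chunk) % 256  # toy checksum as the per-chunk result

# A mock "image": 1 MB of synthetic pixel data.
image = bytes(range(256)) * 4096

# Split into 64 KB packets and process them on separate workers.
chunks = split_into_chunks(image, chunk_size=64 * 1024)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(analyze_chunk, chunks))

print(f"{len(chunks)} chunks processed")  # 16 chunks
```

Because each chunk is analyzed independently, the same structure scales from threads on one machine to processes spread across a cluster, which is the point of the parallel-processing work described above.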


Foran emphasizes that the understanding and treatment of cancer has undergone a dramatic shift as oncology has moved from one-size-fits-all attacks on tumors towards personalized medicine. But cancers are complex diseases controlled by many genes and other factors. “It's not as if you're going to solve cancer,” he says. But big data can provide new, better-targeted ways of grappling with the disease. “You're going to come up with probably a whole new set of blueprints for how to treat patients.”



Scientists develop a 'nanovesicle' that delivers complementary molecules inside cells


Researchers at the University of Miami and the University of Ulster have created self-assembling nanoparticles that can transport drugs and other molecules into target living cells.


The new nanocarriers are just 15 nanometers in diameter, based on building blocks called amphiphilic polymers: they have both hydrophilic (water-loving, polar) and lipophilic (fat-loving) properties. That allows the nanocarriers to hold the guest molecules within their water-insoluble interior and use their water-soluble exterior to travel through an aqueous environment. And that makes the nanocarriers ideal for transferring molecules that would otherwise be insoluble in water.


They also emit a fluorescent signal that can be observed with a microscope, allowing for tracking and photographing the nanoparticles in the body.


“The size of these nanoparticles, their dynamic character and the fact that the reactions take place under normal biological conditions (at ambient temperature and neutral environment) makes these nanoparticles an ideal vehicle for the controlled activation of therapeutics, directly inside the cells,” says lead investigator Francisco Raymo, professor of chemistry in the University of Miami College of Arts and Sciences and UM laboratory for molecular photonics.


The next phase of this investigation involves demonstrating that this method can be used to achieve chemical reactions inside cells, instead of energy transfers.


Beyond GPS: Five next-generation technologies


Several DARPA programs are exploring innovative technologies and approaches that could supplement GPS to provide reliable, highly accurate real-time positioning, navigation and timing (PNT) data for military and civilian uses and deal with possible loss of GPS accuracy from solar storms or jamming, for example.


DARPA Director Arati Prabhakar said the agency currently has five programs that focus on PNT-related technology.


Adaptable Navigation Systems (ANS) is developing new algorithms and architectures that can create better inertial measurement devices. By using cold-atom interferometry, which measures the relative acceleration and rotation of a cloud of atoms stored within a sensor, extremely accurate inertial measurement devices could operate for long periods without needing external data to determine time and position. ANS also seeks to exploit non-navigational electromagnetic signals — including commercial satellite, radio and television signals and even lightning strikes — to provide additional points of reference for PNT.


Microtechnology for Positioning, Navigation, and Timing (Micro-PNT) leverages extreme miniaturization made possible by DARPA-developed micro-electromechanical systems (MEMS) technology. These include precise chip-scale gyroscopes, clocks, and complete integrated timing and inertial measurement devices. DARPA researchers have fabricated a prototype with three gyroscopes, three accelerometers and a highly accurate master clock on a chip that fits easily on the face of a penny.


Quantum-Assisted Sensing and Readout (QuASAR) intends to make the world’s most accurate atomic clocks — which currently reside in laboratories — both robust and portable. QuASAR researchers have developed optical atomic clocks in laboratories with a timing error of less than 1 second in 5 billion years. Making clocks this accurate and portable could improve upon existing military systems such as GPS, and potentially enable entirely new radar, LIDAR, and metrology applications.
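It is easy to translate "less than 1 second in 5 billion years" into the fractional frequency uncertainty metrologists usually quote. A rough conversion, assuming ~3.156 × 10^7 seconds per year (the exact figure depends on the year convention used):

```python
seconds_per_year = 3.156e7   # approximate length of a year in seconds
error_seconds = 1.0          # accumulated timing error
interval_years = 5e9         # over five billion years

fractional_error = error_seconds / (interval_years * seconds_per_year)
print(f"Fractional uncertainty: {fractional_error:.1e}")  # ~6.3e-18
```

A fractional uncertainty in the low 10^-18 range is consistent with the best laboratory optical lattice clocks of this era.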


The Program in Ultrafast Laser Science and Engineering (PULSE) applies the latest in pulsed laser technology to significantly improve the precision and size of atomic clocks and microwave sources, enabling more accurate time and frequency synchronization over large distances. It could enable global distribution of time precise enough to take advantage of the world’s most accurate optical atomic clocks.


The Spatial, Temporal and Orientation Information in Contested Environments (STOIC) program seeks to develop PNT systems that are independent of GPS: long-range robust reference signals, ultra-stable tactical clocks, and multifunctional systems that provide PNT information among multiple users.

Russ Roberts's curator insight, July 31, 12:01 AM

Thanks to Dr. Stefan Gruenwald for this fascinating article. According to Gruenwald, major changes are coming in the way we do GPS. DARPA Director Arati Prabhakar said backup plans are being made to keep the nation's digital network functioning after a "Carrington Event" super flare from the sun. These changes will also affect the APRS system used by amateur radio operators. Prabhakar said DARPA is looking into these technologies:

 

Adaptable Navigation Systems.

Microtechnology for position, navigation, and timing.

Quantum-assisted sensing and readout (QUASAR).

The Program in ultrafast laser science and engineering.

The spatial, temporal and orientation information in contested environments (STOIC) program.

 

Exciting times are ahead for the military, civilian industry, and even amateur radio operators.  Aloha de Russ (KH6JRM).

Scooped by Dr. Stefan Gruenwald

Vision-correcting display makes reading glasses so yesterday


What if computer screens had glasses instead of the people staring at the monitors? That concept is not too far afield from technology being developed by UC Berkeley computer and vision scientists.


The researchers are developing computer algorithms to compensate for an individual’s visual impairment, and creating vision-correcting displays that enable users to see text and images clearly without wearing eyeglasses or contact lenses. The technology could potentially help hundreds of millions of people who currently need corrective lenses to use their smartphones, tablets and computers. One common problem, for example, is presbyopia, a type of farsightedness in which the ability to focus on nearby objects is gradually diminished as the aging eyes’ lenses lose elasticity.


More importantly, the displays could one day aid people with more complex visual problems, known as high order aberrations, which cannot be corrected by eyeglasses, said Brian Barsky, UC Berkeley professor of computer science and vision science, and affiliate professor of optometry.


“We now live in a world where displays are ubiquitous, and being able to interact with displays is taken for granted,” said Barsky, who is leading this project. “People with higher order aberrations often have irregularities in the shape of the cornea, and this irregular shape makes it very difficult to have a contact lens that will fit. In some cases, this can be a barrier to holding certain jobs because many workers need to look at a screen as part of their work. This research could transform their lives, and I am passionate about that potential.”


“The significance of this project is that, instead of relying on optics to correct your vision, we use computation,” said lead author Fu-Chung Huang, who worked on this project as part of his computer science Ph.D. dissertation at UC Berkeley under the supervision of Barsky and Austin Roorda, professor of vision science and optometry. “This is a very different class of correction, and it is non-intrusive.”


The algorithm, which was developed at UC Berkeley, works by adjusting the intensity of each direction of light that emanates from a single pixel in an image, based on a user's specific visual impairment. In a process called deconvolution, the image is prefiltered so that, once the light passes through a pinhole array fitted over the display, the user perceives a sharp image.
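As a rough illustration of the idea (not the authors' actual pipeline), prefiltering an image by the inverse of the viewer's blur kernel can be sketched with a Wiener filter; the PSF model, the SNR constant, and the function name here are all hypothetical:

```python
import numpy as np

def wiener_prefilter(image, psf, snr=100.0):
    """Prefilter `image` so that, after being blurred by `psf` (a model of
    the viewer's optical defocus), the perceived result approximates the
    original.  Wiener regularization avoids dividing by near-zero
    frequencies of the blur kernel."""
    H = np.fft.fft2(psf, s=image.shape)              # blur in frequency domain
    G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)    # regularized inverse filter
    out = np.real(np.fft.ifft2(np.fft.fft2(image) * G))
    return np.clip(out, 0.0, 1.0)                    # displays can't show <0 or >1
```

Real vision-correcting displays pair this kind of computation with a pinhole or lenslet layer, so that different screen pixels reach different parts of the pupil; a plain 2D prefilter alone cannot fully invert defocus.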

Scooped by Dr. Stefan Gruenwald

Underwater self-healing polymer mimics biological self-repair of mussels


A common acrylic polymer used in biomedical applications and as a substitute for glass has been given the ability to completely self-heal underwater by US researchers. The method, which takes inspiration from the self-healing abilities of adhesive proteins secreted by mussels, could allow for longer-lasting biomedical implants. A temporary hydrogen-bonding network stitches the damage as the material fuses back together.


'Polymer self-healing research is about 10 years old now and many different strategies have been developed,' says Herbert Waite, who conducted the work with colleagues at the University of California, Santa Barbara. 'None, however, address the need for healing in a wet medium – a critical omission as all biomaterials function, and fail, in wet environments.'


The idea of mimicking the biological self-healing ability of mussel adhesive proteins is not new, and previous attempts have involved polymer networks functionalised with catechols – synthetic water-soluble organic molecules that mimic mussel adhesive proteins – and metal-ion mediated bonding. However, how mussel adhesive proteins self-heal remains poorly understood, which has limited attempts to synthesise catechols that accurately mimic biological self-healing underwater.


Now, Waite and colleagues have discovered a new aspect of catechols after they were simply 'goofing around' in the lab and found a new way to modify the surface of poly(methyl methacrylate), or PMMA, with catechols. This led them to explore the material's properties and discover that hydrogen bonding enables the polymer to self-heal underwater after being damaged. 'Usually, catechols in wet adhesives are associated with covalent or coordination mediated cross-linking. Our results argue that hydrogen bonding can also be critical, especially as an initiator of healing,' he says.


The healing process begins because catechols provide multidentate hydrogen-bonding faces that trigger a network of hydrogen bonds to fix any damage – the interaction is strong enough to resist interference by water but reversible. Acting a bit like dissolvable stitches, hydrogen bonding between the catechols appears to stitch the damaged area, which allows the underlying polymer to fuse back together. After about 20 minutes, the hydrogen bonded catechols mysteriously disappear leaving the original site of damage completely healed. 'We don't know where the hydrogen bonded catechols go,’ Waite says. ‘Possibly back to the surface, dispersed within the bulk polymer, or some other possibility.'


Phillip Messersmith, a biomaterials expert at the University of California, Berkeley, US, says that this is ‘really creative work’. '[This] reveals a new dimension of catechols, which in this case mediate interfacial self-healing through the formation of hydrogen bonds between surfaces, and which are ultimately augmented or replaced by other types of adhesive interactions.'

Scooped by Dr. Stefan Gruenwald

Weighing the Milky Way: Researchers devise precise method for calculating the mass of galaxies


Does the Milky Way look fat in this picture? Has Andromeda been taking skinny selfies? Using a new, more accurate method for measuring the mass of galaxies, an international group of researchers has shown that the Milky Way has only half the mass of the Andromeda Galaxy.


In previous studies, researchers were only able to estimate the mass of the Milky Way and Andromeda based on observations made using their smaller satellite dwarf galaxies. In the new study, researchers culled previously published data that contained information about the distances between the Milky Way, Andromeda and other close-by galaxies -- including those that weren't satellites -- that reside in and right outside an area referred to as the Local Group.


Galaxies in the Local Group are bound together by their collective gravity. As a result, while most galaxies, including those on the outskirts of the Local Group, are moving farther apart due to expansion, the galaxies in the Local Group are moving closer together because of gravity. For the first time, researchers were able to combine the available information about gravity and expansion to complete precise calculations of the masses of both the Milky Way and Andromeda.
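The classic way to combine gravity and expansion is the "timing argument": assume the Milky Way and Andromeda began separating with the Big Bang and have been decelerated by their mutual gravity ever since, then solve a radial two-body orbit for the mass that matches today's separation and approach speed. A minimal numerical sketch (illustrative inputs and a simplified model, not the study's actual data or method):

```python
import math

# Illustrative inputs only (not the paper's measurements).
G    = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
KPC  = 3.086e19           # kiloparsec in meters
GYR  = 3.156e16           # gigayear in seconds
MSUN = 1.989e30           # solar mass in kg

r = 770 * KPC             # Milky Way-Andromeda separation
v = -109e3                # radial velocity, m/s (negative: approaching)
t = 13.8 * GYR            # time since the galaxies started separating

# Radial Kepler orbit, parametrized by eccentric anomaly e in (pi, 2*pi):
#   r = a (1 - cos e),  t = sqrt(a^3 / (G M)) (e - sin e),
#   v = sqrt(G M / a) sin e / (1 - cos e)
# so v*t/r = sin(e) (e - sin e) / (1 - cos e)^2 fixes e independently of M.
target = v * t / r

def f(e):
    return math.sin(e) * (e - math.sin(e)) / (1 - math.cos(e)) ** 2

lo, hi = math.pi + 1e-6, 2 * math.pi - 1e-3
for _ in range(100):      # bisection: f falls from ~0 toward -inf here
    mid = 0.5 * (lo + hi)
    if f(mid) > target:
        lo = mid
    else:
        hi = mid
e = 0.5 * (lo + hi)

M = r**3 * (e - math.sin(e)) ** 2 / (G * t**2 * (1 - math.cos(e)) ** 3)
print(f"Local Group mass ~ {M / MSUN:.1e} solar masses")
```

With these inputs the sketch lands at a few times 10^12 solar masses, the right order of magnitude for Local Group estimates; the published study adds the surrounding galaxies' motions to pin down how that total splits between the two giants.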


"Historically, estimations of the Milky Way's mass have been all over the map," said Matthew Walker, an assistant professor of physics at Carnegie Mellon. "By studying two massive galaxies that are close to each other and the galaxies that surround them, we can take what we know about gravity and pair that with what we know about expansion to get an accurate account of the mass contained in each galaxy. This is the first time we've been able to measure these two things simultaneously."


By studying both the galaxies in and immediately outside the Local Group, Walker was able to pinpoint the group's center. The researchers then calculated the mass of both the ordinary, visible matter and the invisible dark matter throughout both galaxies based on each galaxy's present location within the Local Group. Andromeda had twice as much mass as the Milky Way, and in both galaxies 90 percent of the mass was made up of dark matter.


The study was supported by the UK's Science and Technology Facilities Council and led by Jorge Peñarrubia of the University of Edinburgh's School of Physics and Astronomy. Co-authors include Yin-Zhe Ma of the University of British Columbia and Alan McConnachie of the NRC Herzberg Institute of Astrophysics.


Carnegie Mellon University. "Weighing the Milky Way: Researchers devise precise method for calculating the mass of galaxies."

Scooped by Dr. Stefan Gruenwald

Invertebrate numbers nearly halve as human population doubles


Invertebrate numbers have decreased by 45% on average over a 35-year period in which the human population doubled, according to a study of the human impact on declining animal numbers.
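Spread evenly over 35 years, a 45% drop corresponds to a compound annual decline of roughly 1.7% (simple arithmetic on the headline figures; my illustration, not a number from the study):

```python
# Convert the cumulative 45%-over-35-years figure into an annual rate.
total_decline = 0.45
years = 35
annual_survival = (1 - total_decline) ** (1 / years)   # fraction surviving each year
annual_decline_pct = (1 - annual_survival) * 100
print(f"~{annual_decline_pct:.1f}% fewer invertebrates per year")
```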


This decline matters because of the enormous benefits invertebrates such as insects, spiders, crustaceans, slugs and worms bring to our day-to-day lives, including pollination and pest control for crops, decomposition for nutrient cycling, water filtration and human health.


The study, published in Science and led by UCL, Stanford and UCSB, focused on the demise of invertebrates in particular, as large vertebrates have already been extensively studied. The team found similarly widespread losses in both groups; the ongoing decline in invertebrates surprised scientists, who had previously viewed them as nature's survivors.


The decrease in invertebrate numbers is due to two main factors – habitat loss and climate disruption on a global scale. In the UK alone, scientists noted the areas inhabited by common insects such as beetles, butterflies, bees and wasps saw a 30-60% decline over the last 40 years.


Scientists believe there is a growing understanding of how ecosystems are changing but to tackle these issues, better predictions of the impact of changes are needed together with effective policies to reverse the losses currently seen. Using this approach, conservation of species can be prioritized with the benefit of protecting processes that serve human needs, and successful campaigns scaled-up to effect a positive change globally.

Rescooped by Dr. Stefan Gruenwald from Brain Tricks: Belief, Bias, and Blindspots

Mapping the optimal route between two quantum states

As a quantum state collapses from a quantum superposition to a classical state or a different superposition, it will follow a path known as a quantum trajectory.


In a recent paper in Nature, scientists from the University of Rochester, University of California at Berkeley and Washington University in St. Louis have shown that it is possible to track these quantum trajectories and compare them to a recently developed theory for predicting the most likely path a system will take between two states.


Andrew N. Jordan, professor of physics at the University of Rochester and one of the authors of the paper, and his group had developed this new theory in an earlier paper. The results published this week show good agreement between theory and experiment.


For their experiment, the Berkeley and Washington University teams devised a superconducting qubit with exceptional coherence properties, permitting it to remain in a quantum superposition during the continuous monitoring. The experiment actually exploited the fact that any measurement will perturb a quantum system. This means that the optimal path will come about as a result of the continuous measurement and how the system is being driven from one quantum state to another.


Kater Murch, co-author and assistant professor at Washington University in St. Louis, explained that a key part of the experiment was being able to measure each of these trajectories while the system was changing, something that had not been possible until now.


Jordan compares the experiment to watching butterflies make their way one by one from a cage to nearby trees. "Each butterfly's path is like a single run of the experiment," said Jordan. "They are all starting from the same cage, the initial state, and ending in one of the trees, each being a different end state." By watching the quantum equivalent of a million butterflies make the journey from cage to tree, the researchers were in effect able to predict the most likely path a butterfly took by observing which tree it landed on (known as post-selection in quantum physics measurements), despite the presence of a wind, or any disturbance that affects how it flies, which is similar to the effect measuring has on the system.


"The experiment demonstrates that for any choice of final quantum state, the most likely or 'optimal path' connecting them in a given time can be found and predicted," said Jordan. "This verifies the theory and opens the way for active quantum control techniques." He explained that only if you know the most likely path is it possible to set up the system to be in the desired state at a specific time.


Via Jocelyn Stoller
Scooped by Dr. Stefan Gruenwald

The social origins of intelligence in the brain


By studying the injuries and aptitudes of Vietnam War veterans who suffered penetrating head wounds during the war, researchers have found that brain regions that contribute to optimal social functioning are also vital to general intelligence and emotional intelligence.


This finding, reported in the journal Brain, bolsters the view that general intelligence emerges from the emotional and social context of one’s life.

“We are trying to understand the nature of general intelligence and to what extent our intellectual abilities are grounded in social cognitive abilities,” said Aron Barbey, a University of Illinois professor of neuroscience, psychology, and speech and hearing science.


Barbey, an affiliate of the Beckman Institute and the Institute for Genomic Biology at the University of Illinois, led the new study with an international team of collaborators.


The study involved 144 Vietnam veterans injured by shrapnel or bullets that penetrated the skull, damaging distinct brain tissues while leaving neighboring tissues intact. Using CT scans, the scientists painstakingly mapped the affected brain regions of each participant, then pooled the data to build a collective map of the brain.


The researchers used a battery of carefully designed tests to assess participants’ intellectual, emotional and social capabilities. They then looked for damage in specific brain regions tied to deficits in the participants’ ability to navigate intellectual, emotional or social realms. Social problem solving in this analysis primarily involved conflict resolution with friends, family and peers at work.


As in their earlier studies of general intelligence and emotional intelligence, the researchers found that regions of the frontal cortex (at the front of the brain), the parietal cortex (further back near the top of the head) and the temporal lobes (on the sides of the head behind the ears) are all implicated in social problem solving. The regions that contributed to social functioning in the parietal and temporal lobes were located only in the brain’s left hemisphere, while both left and right frontal lobes were involved.

Rescooped by Dr. Stefan Gruenwald from Brain Tricks: Belief, Bias, and Blindspots

Can Winograd Schemas Replace Turing Test for Defining Human-Level AI?


Earlier this year, a chatbot called Eugene Goostman "beat" a Turing Test for artificial intelligence as part of a contest organized by a U.K. university. Almost immediately, it became obvious that rather than proving that a piece of software had achieved human-level intelligence, all that this particular competition had shown was that a piece of software had gotten fairly adept at fooling humans into thinking that they were talking to another human, which is very different from a measure of the ability to "think." In fact, some observers didn't think the bot was very clever at all.


Clearly, a better test is needed, and we may have one, in the form of a type of question called a Winograd schema that's easy for a human to answer, but a serious challenge for a computer.


The problem with the Turing Test is that it's not really a test of whether an artificial intelligence program is capable of thinking: it's a test of whether an AI program can fool a human. And humans are really, really dumb. We fall for all kinds of tricks that a well-programmed AI can use to convince us that we're talking to a real person who can think.


For example, the Eugene Goostman chatbot pretends to be a 13-year-old boy, because 13-year-old boys are often erratic idiots (I've been one), and that will excuse many circumstances in which the AI simply fails. So really, the chat bot is not intelligent at all—it's just really good at making you overlook the times when it's stupid, while emphasizing the periodic interactions when its algorithm knows how to answer the questions that you ask it.


Conceptually, the Turing Test is still valid, but we need a better practical process for testing artificial intelligence. A new AI contest, sponsored by Nuance Communications and CommonsenseReasoning.org, is offering a US $25,000 prize to an AI that can successfully answer what are called Winograd schemas, named after Terry Winograd, a professor of computer science at Stanford University.
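The canonical example from Levesque's formulation of the challenge shows why such questions resist statistical shortcuts: changing a single word flips which noun the pronoun refers to, so word-co-occurrence tricks don't help. A minimal encoding of that schema:

```python
# The classic Winograd schema: "it" resolves differently depending on one word.
schema = {
    "sentence": "The trophy doesn't fit in the suitcase because it is too {}.",
    "answers": {"big": "the trophy", "small": "the suitcase"},
}

for word, referent in schema["answers"].items():
    print(schema["sentence"].format(word), "->", "it =", referent)
```

Answering correctly requires knowing how containers and sizes work in the world, which is exactly the commonsense reasoning the contest aims to measure.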


Via Jocelyn Stoller
Scooped by Dr. Stefan Gruenwald

TorrentChat 4 Free: Bleep will provide totally secure, totally peer-to-peer chatting from BitTorrent


The world's most successful data transfer protocol could underlie the next generation chat client: Bleep will provide totally secure, totally peer-to-peer chatting from BitTorrent.


BitTorrent, creator of the protocol that now handles more than a third of all internet traffic, has a new product called Bleep on the horizon. It aims to bring the distributed, anonymous technology that made BitTorrent so successful to the oldest action in the history of the internet: chat. Using the same BitTorrent connection logic that has allowed you to pirate TV shows and movies for over a decade, Bleep will facilitate direct, encrypted connections between peers, meaning that no outside observer ever gets its hands on your words. To everyone but the intended recipient, your words are effectively “bleeped.” This could be big news for whistleblowers who are trying to keep their identity secret, for businesses that want to ensure the confidentiality of their communications, or just for normal people who want to escape the ever-watchful eye of the NSA.


Encrypted chat programs like Bleep, or even long-standing encrypted email schemes, are generally pretty difficult to use. If you wanted to send me a totally secure email, you’d need to visit my Twitter account for a PGP key (generously hosted at an external MIT key-server), use that to add me as an encrypted messaging buddy, then use specialized email software to send/receive messages. Most of the difficulty in sending secure messages comes from the fact that those emails must pass (unreadable) through a number of third parties — but BitTorrent’s whole addition to the tech sphere was its circumvention of unnecessary servers to allow direct peer-to-peer (p2p) communication.


As soon as you bring third parties into the system (i.e. remote servers), you bring trust into the equation, and exponentially increase the ways your communications could be attacked. If your conversations are all flowing through some Google, Microsoft, or Apple server somewhere, then it doesn’t really matter how well you protect things on your end; if the NSA/Snowden leaks have taught us anything, it’s that third parties can readily reveal your communications. Even encrypted chat services like ChatCrypt don’t fully get around this problem — though to be fair, they do a pretty good job.


With Bleep, the creators use something called a Distributed Hash Table (DHT) to basically associate public encryption keys (you’ll still need those) with IP addresses. Using this information (encryption+online location) the BitTorrent protocol can establish a direct link between two users with no intermediaries. BitTorrent says there will be absolutely no record of the IP-lookup (this would be a piece of metadata), as each user finds the other through the network’s many distributed nodes rather than a central lookup server. This system might work via a one-time lookup per user, or require a DHT-check to establish a connection at the beginning of every conversation; the documentation is still quite vague.
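A toy version of the idea (my sketch; Bleep's actual protocol and node logic are not public in this detail) maps public-key fingerprints to network addresses across a ring of nodes, so no single server ever holds the whole directory:

```python
import hashlib
from bisect import bisect_right

class ToyDHT:
    """Toy distributed hash table: each node owns an arc of the hashed
    keyspace, and a lookup goes straight to the responsible node, so no
    central server sees the (key -> address) mapping."""

    def __init__(self, node_ids):
        self.ring = sorted((self._h(n), n) for n in node_ids)
        self.store = {n: {} for n in node_ids}

    @staticmethod
    def _h(s):
        return int(hashlib.sha256(s.encode()).hexdigest(), 16)

    def _node_for(self, key):
        # First node clockwise from the key's hash owns the key.
        idx = bisect_right(self.ring, (self._h(key), "")) % len(self.ring)
        return self.ring[idx][1]

    def publish(self, pubkey_fingerprint, address):
        self.store[self._node_for(pubkey_fingerprint)][pubkey_fingerprint] = address

    def lookup(self, pubkey_fingerprint):
        return self.store[self._node_for(pubkey_fingerprint)].get(pubkey_fingerprint)
```

A real DHT like the BitTorrent network's Kademlia variant distributes the routing step too (each node only knows a few neighbors and lookups hop between them), but the ownership-by-hash principle is the same.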


BitTorrent, as an idea, is sort of the apex predator of modern data-giants like Google. Encryption and p2p tech has not hurt Google much because, frankly, it’s always been too clunky to catch on for any large proportion of users’ time. On the other hand, Bleep will offer features like importing your Google contacts list for easy setup; people will still need to generate an encryption key-pair, but as it becomes more practically feasible to pass around more and more types of data without any assistance, it will ruffle more and more feathers. Don’t be surprised if those pro-tech super-PACs everyone’s so excited about end up opposing anti-cloud efforts like this one, especially when BitTorrent takes its thinking to the logical conclusion and releases a competitor to the onion routing protocol – i.e. Tor – which would allow a fully p2p web browser.


You can sign up for an early-access list for Bleep, but there’s no telling how long you might wait, and you’ll need to convince some friends to be as up in arms about privacy as you are. Though it’s still in a closed alpha phase, Bleep is built on the framework of the old BitTorrent Chat experiment, so it has already had extensive testing in its basic functionality. For those who want it, Bleep could be a near-perfect solution — but relying on supposedly impregnable software has burned many people in the past. We’ll see how well Bleep can measure up to the unforgiving storm of cyber-attacks that come to bear on virtually every “secure” software ever made.

Scooped by Dr. Stefan Gruenwald

A new way to make microstructured surfaces


A team of researchers has created a new way of manufacturing microstructured surfaces that have novel three-dimensional textures. These surfaces, made by self-assembly of carbon nanotubes, could exhibit a variety of useful properties — including controllable mechanical stiffness and strength, or the ability to repel water in a certain direction.


“We have demonstrated that mechanical forces can be used to direct nanostructures to form complex three-dimensional microstructures, and that we can independently control … the mechanical properties of the microstructures,” says A. John Hart, the Mitsui Career Development Associate Professor of Mechanical Engineering at MIT and senior author of a paper describing the new technique in the journal Nature Communications.


The technique works by inducing carbon nanotubes to bend as they grow. The mechanism is analogous to the bending of a bimetallic strip, used as the control in old thermostats, as it warms: One material expands faster than another bonded to it.  But in this new process, the material bends as it is produced by a chemical reaction. 


The process begins by printing two patterns onto a substrate: one is a catalyst from which the carbon nanotubes grow; the second material modifies the nanotubes' growth rate. By offsetting the two patterns, the researchers showed that the nanotubes bend into predictable shapes as they extend.


“We can specify these simple two-dimensional instructions, and cause the nanotubes to form complex shapes in three dimensions,” says Hart. Where nanotubes growing at different rates are adjacent, “they push and pull on each other,” producing more complex forms, Hart explains. “It’s a new principle of using mechanics to control the growth of a nanostructured material,” he says.
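The bimetallic-strip picture can be made quantitative with simple arc geometry (my toy numbers and model, not measurements from the paper): if two bonded filament regions a distance d apart grow to different lengths, the pair must curl into an arc whose bend angle is set by the length mismatch.

```python
# Arc geometry of differential growth (hypothetical illustrative numbers):
# outer arc length L1 = (R + d/2) * theta, inner L2 = (R - d/2) * theta,
# so theta = (L1 - L2) / d and R = L_mean / theta.
d  = 100e-9        # spacing between the two growth regions, m
L1 = 12e-6         # length of the faster-growing side, m
L2 = 10e-6         # length of the slower-growing side, m

theta  = (L1 - L2) / d               # total bend angle, radians
radius = 0.5 * (L1 + L2) / theta     # radius of curvature, m
print(f"bend angle {theta:.0f} rad, radius {radius*1e6:.2f} um")
```

With these numbers the structure curls through about three full turns, which hints at why offset catalyst patterns can yield twists and coils rather than straight pillars.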


Few high-throughput manufacturing processes can achieve such flexibility in creating three-dimensional structures, Hart says. This technique, he adds, is attractive because it can be used to create large expanses of the structures simultaneously; the shape of each structure can be specified by designing the starting pattern. Hart says the technique could also enable control of other properties, such as electrical and thermal conductivity and chemical reactivity, by attaching various coatings to the carbon nanotubes after they grow.


“If you coat the structures after the growth process, you can exquisitely modify their properties,” says Hart. For example, coating the nanotubes with ceramic, using a method called atomic layer deposition, allows the mechanical properties of the structures to be controlled. “When a thick coating is deposited, we have a surface with exceptional stiffness, strength, and toughness relative to [its] density,” Hart explains. “When a thin coating is deposited, the structures are very flexible and resilient.”   

This approach may also enable “high-fidelity replication of the intricate structures found on the skins of certain plants and animals,” Hart says, and could make it possible to mass-produce surfaces with specialized characteristics, such as the water-repellent and adhesive ability of some insects. “We’re interested in controlling these fundamental properties using scalable manufacturing techniques,” Hart says.  

Scooped by Dr. Stefan Gruenwald

Largest ever Ebola outbreak is NOT a global threat

Although the virus is exerting a heavy toll in West Africa, it does not spread easily.


Can Ebola spread on planes?


Deadly Ebola probably touched down in Lagos, Nigeria, the largest city in Africa, on 20 July. A man who was thought to be infected with the virus had arrived there on a flight from Liberia, where, along with Guinea and Sierra Leone, the largest recorded Ebola outbreak is currently raging. The Lagos case is the first to be internationally exported by air travel and today the UK foreign secretary announced that he would chair a government meeting on Ebola. As long as the virus continues to infect people in Liberia, Guinea and Sierra Leone, there is a small risk of more long-distance exports of the disease. But, Ebola does not pose a global threat.


The World Health Organization still considers the Lagos case a “probable” infection because it has not yet confirmed that the 40-year-old Liberian man had Ebola. He was quarantined upon arrival at the airport and taken to hospital, where he died on 25 July. Assuming he had Ebola, if proper control measures were taken at the airport and at the hospital, the risk that health-care workers or others will become infected as a result of contact with him is low. The European Centre for Disease Prevention and Control classifies people sharing public transport with someone infected as having a “very low” risk of catching the virus. Healthcare workers and doctors, several of whom have now been infected and died as a result of caring for people in the current outbreak, are at much higher risk and the WHO advises that they take strict precautions, which greatly lowers the risk.


The ECDC also says the probability of an infected person getting on a flight in the first place is low, given the small overall number of Ebola cases. Moreover, functional health systems should be able to prevent onward spread from any exported cases. Overall, the World Health Organization estimates that there is a high risk of spread to countries bordering those with existing outbreaks, a moderate risk to countries further afield in the sub-region, but that there is little chance of spread overseas. There is no reason to assume that an exported case — be it to Lagos, a city of 17 million people, or any other place — will spark new outbreaks, because Ebola is not highly contagious.

Scooped by Dr. Stefan Gruenwald

Screw-shaped nanopropeller can actively move in a gel-like fluid


Schematic of micro- and nanopropellers in hyaluronan gels. The polymeric mesh structure blocks the larger micropropellers (top left), but the smaller nanopropellers can pass through the mesh openings.


Israeli and German researchers have created a nanoscale screw-shaped propeller that can move in a gel-like fluid, mimicking the environment inside a living organism, as described in a paper published in the June 2014 issue of ACS Nano.


The team comprises researchers from Technion, the Max Planck Institute for Intelligent Systems, and the Institute for Physical Chemistry at the University of Stuttgart.


The filament that makes up the propeller, made of silica and nickel, is only 70 nanometers in diameter; the entire propeller is 400 nanometers long, small enough that its motion is affected by the Brownian motion of nearby molecules.


To test if the propellers could move through living organisms, they used hyaluronan, a material that occurs throughout the human body, including the synovial fluids in joints and the vitreous humor in your eyeball.


The hyaluronan gel contains a mesh of long polymer chains; the mesh is tight enough to prevent micron-sized (millionths of a meter) propellers from moving much at all, but its openings are large enough for nanometer-sized objects to pass through. The scientists were able to control the motion of the propellers using a relatively weak rotating magnetic field.


“One can now think about targeted applications, for instance, in the eye, where they may be moved to a precise location at the retina,” says Peer Fischer, a member of the research team and head of the Micro, Nano, and Molecular Systems Lab at the Max Planck Institute for Intelligent Systems.


Scientists could also attach “active molecules” to the tips of the propellers, or use the propellers to deliver tiny targeted doses of radiation.

Scooped by Dr. Stefan Gruenwald

Sverker Johansson and his Bots have created 2.7 million Wikipedia entries

A 53-year-old Swede can take credit for 2.7 million articles on Wikipedia, but some "purists" complain about his method.


Sverker Johansson could be the most prolific author you've never heard of. Volunteering his time over the past seven years publishing to Wikipedia, the 53-year-old Swede can take credit for 2.7 million articles, or 8.5% of the entire collection, according to Wikimedia analytics, which measures the site's traffic. His stats far outpace any other user, the group says.


He has been particularly prolific cataloging obscure animal species, including butterflies and beetles, and is proud of his work highlighting towns in the Philippines. About one-third of his entries are uploaded to the Swedish language version of Wikipedia, and the rest are composed in two versions of Filipino, one of which is his wife's native tongue. An administrator holding degrees in linguistics, civil engineering, economics and particle physics, he says he has long been interested in "the origin of things, oh, everything."


It isn't uncommon, however, for Wikipedia purists to complain about his method. That is because the bulk of his entries have been created by a computer software program—known as a bot. Critics say bots crowd out the creativity only humans can generate.


Mr. Johansson's program scrubs databases and other digital sources for information, and then packages it into an article. On a good day, he says his "Lsjbot" creates up to 10,000 new entries.


On Wikipedia, any registered user can create an entry. Mr. Johansson has to find a reliable database, create a template for a given subject and then launch his bot from his computer. The software program searches for information, then publishes it to Wikipedia.
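The workflow just described (find a reliable database, build a template for a subject, let the bot fill and publish entries) can be sketched in a few lines. This is a hypothetical illustration, not Lsjbot's actual code; the species records below are invented placeholders, not verified taxonomy.

```python
# Hypothetical sketch of a Lsjbot-style stub generator. The real bot's
# data sources, template, and publishing step are not described in the
# article; this only illustrates the database-record-to-stub idea.

STUB_TEMPLATE = (
    "'''{name}''' is a species of {group} in the family {family}. "
    "It was first described by {author} in {year}."
)

def make_stub(record):
    """Package one database record into a one-sentence wikitext stub."""
    return STUB_TEMPLATE.format(**record)

# Placeholder records standing in for rows pulled from a species database.
records = [
    {"name": "Examplea prima", "group": "beetle",
     "family": "Buprestidae", "author": "Smith", "year": 1907},
    {"name": "Examplea secunda", "group": "butterfly",
     "family": "Nymphalidae", "author": "Jones", "year": 1819},
]

stubs = [make_stub(r) for r in records]
for s in stubs:
    print(s)
```

A real bot would replace the `records` list with database queries and push each stub through the wiki's editing API; the template-fill step is the part critics call mechanical.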


Bots have long been used to author and edit entries on Wikipedia, and, more recently, an increasingly large amount of the site's new content is written by bots. Their use is regulated by Wikipedia users called the "Bot Approvals Group."


While Mr. Johansson works to achieve consensus approval for his project, he and his bot-loving peers expect to continue facing resistance. "There is a vocal minority who don't like it," he said during a recent speech on his work. Still, he soldiers on.


"I'm doing this to create absolute democracy online," Mr. Johansson said recently while sitting in front of a computer at his office at Sweden's Dalarna University.


Wikipedia, he reckons, should someday be able to tell people everything about everything. His bot, which took him months' worth of programming to create, is a step toward achieving that goal sooner rather than later—even if the entries it creates are bare-bones "stubs" containing basic information.


Achim Raschka is one of the people who would like Mr. Johansson to change course. The 41-year-old German Wikipedia enthusiast can spend days writing an in-depth article about a single type of plant.


"I am against production of bot-generated stubs in general," he said. He is particularly irked by Mr. Johansson's Lsjbot, which prizes quantity over quality and is "not helping the readers and users of Wikipedia."


Robots + AI + AofA: The astounding athletic power of quadcopters - Raffaello D'Andrea

In a robot lab at TEDGlobal, Raffaello D'Andrea demos his flying quadcopters: robots that think like athletes, solving physical problems with mathematical algorithms and AI.




Via Prof. Hankell
Prof. Hankell's curator insight, July 28, 2:04 PM

We are beginning to see autonomous technology and artificial intelligence that we will interact with as we would with other people...


Relativity behind mercury's liquidity


Why is mercury a liquid at room temperature? If you ask that question in a school classroom you will probably be told that relativity affects the orbitals of heavy metals, contracting them and changing how they bond. However, the first evidence that this explanation is correct has only just been published.


In the 1960s, Pekka Pyykkö, now at the University of Helsinki, Finland, discovered that gold’s colour is the result of relativistic effects. He showed that the lowered energy of gold’s 6s orbital means that the energy required to excite an electron from the 5d band lies in the visible rather than the UV range of light. Gold therefore absorbs blue light while reflecting yellow and red light, and it is this that gives the metal its characteristic hue. If the energies of the two bands are calculated without including relativistic effects, the energy required is much greater. Further calculations have subsequently shown the influence of relativity on the colour and bond lengths of heavy metal compounds, as well as its importance in catalysis. However, the low melting point of mercury could still only be described as ‘probably’ due to relativistic effects.


An international team led by Peter Schwerdtfeger of Massey University Auckland in New Zealand used quantum mechanical calculations of the metal's heat capacity, either including or excluding relativistic effects. They showed that if they ignored relativity, the predicted melting point of mercury was 82°C; if they included relativistic effects, their answer closely matched the experimental value of -39°C.


Relativity states that objects get heavier the faster they move. In atoms, the velocity of the innermost electrons is related to the nuclear charge. The larger the nucleus gets the greater the electrostatic attraction and the faster the electrons have to move to avoid falling into it. So, as you go down the periodic table these 1s electrons get faster and faster, and therefore heavier, causing the radius of the atom to shrink. This stabilises some orbitals, which also have a relativistic nature of their own, while destabilising others. This interplay means that for heavy elements like mercury and gold, the outer electrons are stabilised. In mercury’s case, instead of forming bonds between neighbouring mercury atoms, the electrons stay associated with their own nuclei, and weaker interatomic forces such as van der Waals bonds hold the atoms together.
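The contraction argument above can be made quantitative with the standard hydrogen-like estimate v/c ≈ Zα for a 1s electron. This is a back-of-envelope sketch, not the paper's full quantum-mechanical treatment.

```python
import math

ALPHA = 1 / 137.036  # fine-structure constant

def gamma_1s(Z):
    """Relativistic mass factor for a 1s electron, using the
    hydrogen-like estimate v/c = Z * alpha for nuclear charge Z."""
    beta = Z * ALPHA
    return 1.0 / math.sqrt(1.0 - beta ** 2)

# Mercury, Z = 80: the 1s electron moves at roughly 58% of light speed,
# its effective mass rises by about 23%, and the orbital radius shrinks
# by roughly the same factor (r scales as 1/gamma).
g = gamma_1s(80)
print(f"v/c = {80 * ALPHA:.2f}, gamma = {g:.2f}")
```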


Diet Affects Men’s and Women’s Intestinal Microbes Differently


The microbes living in the guts of males and females react differently to diet, even when the diets are identical, according to a study by scientists from The University of Texas at Austin and six other institutions, published this week in the journal Nature Communications. These results suggest that therapies designed to improve human health and treat diseases through nutrition might need to be tailored for each sex.


The researchers studied the gut microbes in two species of fish and in mice, and also conducted an in-depth analysis of data that other researchers collected on humans. They found that in fish and humans diet affected the microbiota of males and females differently. In some cases, different species of microbes would dominate, while in others, the diversity of bacteria would be higher in one sex than the other.
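The article doesn't say which diversity measure the researchers used; a common choice in microbiome work is the Shannon index, sketched here with made-up abundance data to show why a community dominated by one taxon scores lower than an even one.

```python
import math

def shannon_diversity(counts):
    """Shannon index H = -sum(p_i * ln p_i) over taxon abundances."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Illustrative (invented) abundance profiles for four taxa:
dominated = [90, 5, 3, 2]   # one taxon crowds out the rest
even = [25, 25, 25, 25]     # evenly spread community

print(shannon_diversity(dominated))  # lower diversity
print(shannon_diversity(even))       # maximum for 4 taxa, ln(4)
```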



Only in recent years has science begun to completely appreciate the importance of the human microbiome, which consists of all the bacteria that live in and on people’s bodies. There are hundreds or even thousands of species of microbes in the human digestive system alone, each varying in abundance.


Genetics and diet can affect the variety and number of these microbes in the human gut, which can in turn have a profound influence on human health. Obesity, diabetes, and inflammatory bowel disease have all been linked to low diversity of bacteria in the human gut.


One concept for treating such diseases is to manipulate the microbes within a person’s gut through diet. The idea is gaining in popularity because dietary changes would make for a relatively cheap and simple treatment.


Much remains to be learned about which species, or combinations of microbial species, are best for human health. To accomplish this, research has to illuminate how these microbes react to various combinations of diet, genetics and environment. Unfortunately, most such studies to date examine only one factor at a time and do not take into account how these variables interact.


“Our study asks not just how diet influences the microbiome, but it splits the hosts into males and females and asks, do males show the same diet effects as females?” said Daniel Bolnick, professor in The University of Texas at Austin's College of Natural Sciences and lead author of the study.


World's first artificial leaf uses chloroplasts from real plants for photosynthesis


The Silk Leaf project uses chloroplasts from real plants suspended in silk proteins to create a hardy vehicle for photosynthesis.


The leaves, created by Royal College of Art student Julian Melchiorri, absorb water and carbon dioxide just like real plants but are made from tough silk proteins that could let them survive space voyages.


Melchiorri explains: "NASA is researching different ways to produce oxygen for long-distance space journeys to let us live in space. This material could allow us to explore space much further than we can now."


The Silk Leaf project was engineered in collaboration with Tufts University silk lab, which helped Melchiorri extract chloroplasts from real leaves and suspend them in a silk matrix.


"The material is extracted directly from the fibres of silk," explains Melchiorri. "This material has an amazing property of stabilising molecules. I extracted chloroplasts from plant cells and placed them inside this silk protein. As an outcome I have the first photosynthetic material that is living and breathing as a leaf does."


Chloroplasts are the parts of plant cells that conduct photosynthesis, using the energy of the sun to turn carbon dioxide and water into glucose and oxygen.
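The overall reaction just described is 6 CO2 + 6 H2O → C6H12O6 + 6 O2; as a worked aside, its atom balance can be checked mechanically.

```python
# Atom-balance check of the overall photosynthesis reaction:
# 6 CO2 + 6 H2O -> C6H12O6 + 6 O2
from collections import Counter

def atoms(species):
    """Total atom counts for a list of (coefficient, formula-dict) pairs."""
    total = Counter()
    for count, elems in species:
        for element, n in elems.items():
            total[element] += count * n
    return total

reactants = atoms([(6, {"C": 1, "O": 2}),          # 6 CO2
                   (6, {"H": 2, "O": 1})])          # 6 H2O
products = atoms([(1, {"C": 6, "H": 12, "O": 6}),   # C6H12O6
                  (6, {"O": 2})])                   # 6 O2

print(reactants == products)  # True: the equation balances
```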


Melchiorri’s creations are currently more conceptual than practical (for one, the efficiency of their photosynthesis hasn’t been measured), but he hopes they could be used in all manner of futuristic architectural projects, perhaps even deploying giant leaves as air filters, hanging them on the exterior of buildings to absorb CO2 and channel fresh air inside.


Mystery Si-C Molecules of the Interstellar Medium: Many of the Things Quite Abundant There are Unknown on Earth


Astronomers have long known that interstellar molecules containing carbon atoms exist and that by their nature they will absorb light shining on them from stars and other luminous bodies. Because of this, a number of scientists have previously proposed that some type of interstellar molecules are the source of diffuse interstellar bands -- the hundreds of dark absorption lines seen in color spectrograms taken from Earth. In showing nothing, these dark bands reveal everything. The missing colors correspond to photons of given wavelengths that were absorbed as they travelled through the vast reaches of space before reaching us. More than that, if these photons were filtered by falling on space-based molecules, the wavelengths reveal the exact energies it took to excite the electronic structures of those absorbing molecules in a defined way.
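The link between an absorbed wavelength and the excitation energy it reveals is just E = hc/λ. As an illustration, the 443 nm value below is the well-known strongest diffuse interstellar band, not a figure taken from this paper.

```python
# Convert an absorption band's wavelength to the photon energy it
# represents: E = h*c / lambda, with h*c = 1239.84 eV*nm.
H_C_EV_NM = 1239.84

def photon_energy_ev(wavelength_nm):
    """Photon energy in electron-volts for a wavelength in nanometers."""
    return H_C_EV_NM / wavelength_nm

# The strongest diffuse interstellar band sits near 443 nm (4428 angstroms),
# corresponding to an excitation energy of about 2.8 eV.
print(f"{photon_energy_ev(443):.2f} eV")
```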


Over the vast, empty reaches of interstellar space, these countless small molecules tumble quietly through the cold vacuum. The interstellar medium is the matter that exists in the space between the star systems in a galaxy. This matter includes gas in ionic, atomic, and molecular form, dust, and cosmic rays. Forged in the fusion furnaces of ancient stars and ejected into space when those stars exploded, these lonely molecules account for a significant amount of all the carbon, hydrogen, silicon and other atoms in the universe. In fact, some 20 percent of all the carbon in the universe is thought to exist as some form of interstellar molecule.


Many astronomers hypothesize that these interstellar molecules are also responsible for an observed phenomenon on Earth known as the "diffuse interstellar bands," spectrographic proof that something out there in the universe is absorbing certain distinct colors of light from stars before it reaches the Earth. But since we don't know the exact chemical composition and atomic arrangements of these mysterious molecules, it remains unproven whether they are, in fact, responsible for the diffuse interstellar bands.


Now in a paper appearing this week in The Journal of Chemical Physics, a group of scientists led by researchers at the Harvard-Smithsonian Center for Astrophysics has offered a tantalizing new possibility: these mysterious molecules may be silicon-capped hydrocarbons like SiC3H, SiC4H and SiC5H, and they present data and theoretical arguments to back that hypothesis. At the same time, the group cautions that history has shown that while many possibilities have been proposed as the source of diffuse interstellar bands, none has been proven definitively.


Armed with that information, scientists here on Earth should be able to use spectroscopy to identify those interstellar molecules -- by demonstrating which molecules in the laboratory have the same absorptive "fingerprints." But despite decades of effort, the identity of the molecules that account for the diffuse interstellar bands remains a mystery. Nobody has been able to reproduce the exact same absorption spectra in laboratories here on Earth.


"Not a single one has been definitively assigned to a specific molecule," said Neil Reilly, a former postdoctoral fellow at Harvard-Smithsonian Center for Astrophysics and a co-author of the new paper. Now Reilly, McCarthy and their colleagues are pointing to an unusual set of molecules — silicon-terminated carbon chain radicals — as a possible source of these mysterious bands.


As they report in their new paper, the team first created silicon-containing carbon chains SiC3H, SiC4H and SiC5H in the laboratory using a jet-cooled silane-acetylene discharge. They then analyzed their spectra and carried out theoretical calculations to predict that longer chains in this family might account for some portion of the diffuse interstellar bands.

